[Binary artifact — not renderable as text: a POSIX (ustar) tar archive of the directory `var/home/core/zuul-output/` containing `logs/kubelet.log.gz`, a gzip-compressed kubelet log collected as Zuul CI job output. The remainder of the file is the compressed payload; its contents cannot be recovered without extracting and decompressing the archive.]
Nd@P]-xAڦ*k‡z7\o-1Z:^$2B'͈u :uayQD,__=/NN*iP8j=ַL}iu^}O6Ȁ_-/(лq)-v^l͒ .{,c^ЍyLs Ca;!/ESC 'J?Ύ7[O ir3k^Hnk{5ih#JY׍Xg|}#2}n叻wiJHjt%hAyZu6;o [׽l];Y]j'bZEQUj8[[r]n!;-kUfkW"qe(O`}~Ȩ GR#)7WG]r~gmSٻqmW|3ӹ ,Kg- iU {Vc24ę$P:,R!5m8giom-W\rлQA•Ѿf8SAPE1BB*RNNulr֝{ޚ]LrDьnw7m깵 ,Bz:~4@ Cy$3픙8ޛ2&-e WM?V' URwډWUX@1乡r1ibR$(>D&KybqS;⦞_V_R7 :6^35&Hqtl x64Տ,o6Dž i£븆сfl%vnV|uIS=^0ƚ}'OK[T(c 2jea" '{em}l=x[<m*Е4;FnPc]\Voxg+o %BP(t $.+2Y7nI~mU[_u߽6*rC2eUıfʺfBm}×LYUpcn!^fqPpXA Q<7.Wʼ-{mw%vD] ]\r7h)~tePuZˡ+N=&Dʔ%Dnsc]ZN6>]w2()-ҕd3#2sj7p2W<2( *ꇡ+2ГuT$ONW'zij>= ]͇R>P;HAW="Gt=3jtp1"y+NWto8lyI}4;U?M @h:ifqdU @,W$j9> Gh@eĴ2f;L5fiMM89ۙ!Tt+%re 5yA)˜ n+lטa"=nؚ,H;Qm/<} %^~Rk[9tBT&0h9r~*qS'U3muC;?;n|G>/Zuݸt|ޘj Mod{k ,X<ɜ'mWWi֮e,#WZ9ޮs![O[dQYbof]ehzi;i@$4'oىbEvVIUZ/mثwk7ҸFOA:޶V٭lVN>Uj;OABR{𾕞I{?Ԫ엋G(FX "Y{š˜+D"$jm@]Uj6wS0Lѿ7Gt#{nV9<ڬ}sR<֞;ޭIw>6  ֬ϤBCTi 0 A1( @ KR|l ȉj'*LGXd O@/! E@ŚrP`),o[Yhέ`x,92Hi(#p4I Z<"$RiL~P!e6Pr>/?Zu!W_o|n\k;V+n]C˵l Q;kN?߷˽O5EҲe׻y|;GvyQv1r\z\獃ѹUrwn;2s^}KwI#h}7߬mՃz-OC+ႡvLn[g][_ ^|vI+mMmv\5[e Ae[ZaȤYVy .l2yYsIrnk]ջGGY hi7|n^?H^މӎyETI XNv~6Ύ} og ]Oܿ^7&iڰlї1hwN;wˡy;m0u4̊$ i8'}=ˆyoe<Ί*bq[o00yqBR/NJ-k͏kQ\wN ,O#qsgl]O*UD3M|HTnT}74g+R\`*V{O lz;(]}5LhW(OU)SP/#'f&d3MKP:*j4.M[>Zvgzxjowt6[J[A}h Ilۣ?F~kÀgQ鹱LCf f/>ƙe۔DZqJap,MGVGk) g`DqHѰ6_ rK's:nZoKY$M'i|Kun2Ja~}[I{Mt쌲նY#Wghaӭ/W4]¦NC*a= 7A#" 5NTi% #Bl45%;zjo9?roKvdږ/Ujp~:cK.mg\PZd@w`xP:r/.qY/οoEkf=h/*L1jIܳܤBB yɘͲC5"uOzۓTc &:4}aJ~Οig%,-wcCWN\~ޚtYN6 S2+͢nMWnz XwfRf }{'͹e!H:n 2 ^0 yM Zu#1wYI_uawv+*7AȂ'dj[mrT9<ٯnW0njnT~6e0AO) a[E'i\>Hc,bNxw;&sY$fb܆.2=0)IbMIꆍ ڕ"^cL)>z^[/ջ\t!ؠ;eYn3W7Ai={f!.5;N>6{h- ﮗHq"ZVI}=-9ՋYFݲ&=ąc66r;pwUK gp gp gy:pĖw?6|\l͙Ze[^Yz*QDI娲z2lf2?/ Z_>fKlF9]`<SW @ӕAI/gѕ,yn J2-ϝ JF ztq0 `Lenrc]4I^"]q='Hn幩os+R]@}H]\sc]ateP"vJc#2% ]\Q^ʠ4(yAW/ (Gt%E8?+冮 ZC吁]U8tQ)Yi 6jv4̗1r)Ůđ"=m; {Il{(Io]dG:OFjWj:gZx6_! "-@nmQuP aYRENMIm1ub"9yZ> yqő | G-8gZp N-jOUi}Y[9Ho_zG-^T4# Q@WHW8]Lv[Nl. 
9 vbl`Zw*uWE\uWERz( iu kB3A5 PƹZ·u+Hg QJ Jr-ERXBwPNW@WgHWJPDWؐµ+th%=`Q:GB0:DW Bt9yu(aΐ0֦3tpmg 2v#R:G M kupygFa QJꋡ+eyݩvl^z"np 02tJqbt%]$Q+Ug .;U QR 1)۲1;K \`R6杅&9U%QE 楺(nwE[]<4"3Н3DNIkõ.$)#d2 &nZY‡% 2~}qy禗pr2Ίˇ|o&O_c@{. 5NzH+3b r;则e`4+.y^{8,h?~?Ǔo~_9]QYF{^ȤT9Q=cɷnA2J:rR.5*W2LPSKT,9^0Ol(.ZO T}`"*~> rUbnnSBƍ{y9:;oy>z񺫋\]]yp~yvm?_O/l_S`˚ǫ~+BB_fӸ4@-_~t'm|mGv8۲ĭZVǬt./x}G١\KnN7`vݎhn|[ZjlE\}% }SOm5|D7q>F7s3w9N'Z/vLs(D$zBy`46Sdtr%g(Z6TrsߎF *>U`stD'SK9 Pu0T:[Uok3A9NN;Fjl7 ~ڃc!g{h-(=؜xŷ6dT6ԍ=.j~:@Lei%x3JrO)FDDKϸ4,c"pzEMM4:]*n_'{ɸPq'Ʀ"qƙsU晴2d<$ Yn(ކXdMf.,u/1ژHi ]+|5աhbl0< }ы e9q oEY!283֨zz%&wL2)IȼW_Zg|$ ap!N@bba",eZyy*U\*ȍ2FS#GKOܮX;?w _fpxZ*^Y1f42 RLIfrIY"aRA*YfRmSkziG:tDp:Ҧcmxs5$1-I::ўd2hd6\I5a. {0ٌ5~VaU~*=6ǼӅf|]fjV"UӿQ\ZUFY,עn&U.wCgs-q`}zN/xG[ьhdvl2.s~Ҿ< \8|/OT '&aVfo!#fY4/O=w|L\oB*&._gU'|^ڼ:Z;,[͆E3M[GKѵ+"a2— bV ܌i,*@:rp,f'" q[D]D,HLѣbJx$Q֛; 6DysEo^16(S tU)Ʒբ}w$#jKLn.Jvl-m&;Jcy-]s/],AͫcXC },{h8"w;dwZ~KP7!L| v{:J Gn.KMѾǰmωMDIrgфj넒>,MM& rNJXsGx sD7r|%2=TwuҬ#ߙ?oM*{:qSM heCІ0FMņC酟N4ߍFv fK2[]!|"TkK1ʕG3rF6NX)4h ”)͞~XvmKnv-"3&8ʼnR5, ̚Uc$l[~v>bvy{Ghx5͌^;:gٗf7 -ok[:(Uړ]AhkƱ3Ϸ"@g̲BPTۜĜ$*Ui;Oyy5=|PbJhT2X݌pugNSϠ#-!~6tk)aCt)΀e+th=uBFbJmG8h󠸯"缾pA*^п٨,5(|Oғ3l0V+Vn%Oc Wrrȯ%6/+)aRH!q7 ju\1\\^,{?MwOK~8wcGvg 8dAЏ~..|}frjXŊ,zoMǷz`R~X<\T-M5ljj/濔g&P 4`>?eXty~u2W%"*㢇d}m6-7j]mhE]UK UmxPuW5~)C=qp=okU&eE! 
^צjYٺ{ˁ/p7=dz}kgPWy\GZY;Ъni1pq41c7LY1&c1c1pq5N%1pi88c cC`11;ďJ9ji165ʪǚ9 Jh գb % sxf_|eD;EuHnU hǵMES$b[,=P7ǝk'5}lTk?J4_b_ `:׿ר]Zˬ ?|g\''\PtY܆Ift>6OP3ֆdĬ?k(t:@Lei%x3JrO)FDDKϸ4,c"pzEMMwmm#yYV/ sv1}8vc}uPc~IIdʖ~Hl^U]_UWUs%*t"a]0!S&aJ;f2Rʃp-1Kȳ8y8c[Pc<|v 3Nop}a޷x`q/Q2p9|QjCvӇ`'|p ކiYlŕI&9P(zNor sNE~y@rW\)msmA]x.X^G*#R"%:":XafXTqV#$" (O "/cPa =Vº $v 3/8QFj/|X0Jn,WN˛TӱT6(%b"wrI;FzI|`F#iiKr.u m7J"z<)#1AZy#4 JꭴR*+xj͙HXq\vB6gm9| W<>4ɇfe(b^.F=MqzUsf7ٸ"JORyj,Sӿb)”)\}mUKbfIs\lIذTr$9^^IY>F[הƲkc[ɸrUjyVIRcU(6.[-"9XQPaU!k!i;)ָ݂/eY؟0гc%{1]򭲷O9hq->fEt5AiJ?Ū߼}6e')\0(Wxx8AGѻALwIN?SUPs IۈS91uLLvؔDl.$̔e欜.B{@0XӰ;5iatHNd.70[iίbTq}l< j}Ctfb*}T=Q1˻_M$P[<5ߚ]#g`wSk7`K<.\>'g{!%t@3E8eπq)7wJKxTPuݖlk/ 1Y%l#1p[a:pw LKPv@`Zi,rCDEKh4e==dVjTtW K\Lm-[^8BJURHwB eQ!ÝDH (m!i 9wqav9ّ}7]<ָOwhKnuspRlT|RSX 1ʵgLDeCZ'E]HGkHČ.иLǣQR\"K~_f36NU"7OFٟr^}7y"~M۪uSZ&AMB{^1Ӊ4o7qzSC8B 9W_^j6J!۠LnL )&]T]gOۥ']B]EuBݲ|ZU E}+Mw;uW波f;4xSTL400K|o3ZSNГA\FN'j~=Q;;;zÏ$@I˯(wo@#VKi?txYTrk0Eݙ:(s?^7O!D:0[߱`Rkb*O%6, 5,NllƇht"7E1V N tQvv5ډ;o:,OrVᗮ~YZh~_MS/Wu۱vXFp$ʝVZJ ;.sypIءG76!tpmNNKfiLJNڅj< TW~x<DѧPG Py4ℵ;kHE>ג)E 0ӝ5CC WBSV+8$[qbplP]ll;⡚Kudx,"LSn?+F Ü9Q [cI"kCߜL:.GOC; Yg㡘΃8nn^)\?(ݓXUG>rVRh)>t1ȓXH~Y~/yMqo0gzѤFo8~N-!Е^ͯ߿=`9` `hn)1H-h;,z2XtdNWEѓSXH>bRR% @#Ŝ(&Xt`à⏮@NAIm>Umꝲ><<6,ɗ|_xm(k\Yuf*m^Y8 ?P |:W)z RH yZC6Z~D/-#*n<@.ݰȐCƒc#_[Ua>@|w|MoWȷ v|B]!iLC- 3vje[ׯ+vXqkv]q_bzb \MeV đ 'U LLaDR-QK/̖lf{=8J`v[򿌋*C}i^]4W2:_l8S<`s-:2Άlu~t yå 3jg>LLj9+t4rۏv~m{NZy0G߳ 6+_8 v$F(? 9WaYN -eW? (x8Ii;=*f V, qEjD0,Ճ6} ZYZGˍ./\ x$ )[X1c2b@Dk45[!-~x-ѥ$ ͮO(̬C2$5AZX,_vGEwo> KCR ˽)>ިN}MoR9W;_fvVAm ]or~5. !S)W9_P2zE+br`-&x#32x5sf"H^dGQtm)]KRfs!麾YUr?l}Xyր:LҙMJP* %^F@HHLBDN}katZȅ4*TPu4H&$ZB`#6Hc!aw\,u2ɘV!!: $H[XA`Ir,aK NN:I$퐒vvXF, %A$\/ q05no?n4bIGی~"}ܮ[ :!]٪="rylmHٸ㶼Rn:ǰᙑ. !xbŃJx"4B206p%׼;F$KRb&y$ ViǬQFYJy>%(ck7g&gM%`y.]CjT}뎅zA?>79Cmm1mzܗ|M.dN I0I %J+p.&*h=\cQ y@rW\izmA]GXn HitL*" 3âҌ;$$" (O "/cPaꄟ*v:.%칅u' !-I@8g^p^1 a0X+5Xzl&u$/oFSMRMC i:S. 
QpU(@/h$# \K *HґC` aT=|zǑ <Ja%VZ)_  *: ˹C(XR+IɕL%fk>SZ`*|9㿏A5U\$JUOTnWS.{Ef: OͥhqfT+ >hlTIa.gUߐ= E؜e)ض>ϙ zW^ 0_bJ3K Fb6ᖅn%-0.ʳSGYԇ:$8w)6RY _OpB eQ!ÝDH (m!i 9wqav9ّ}7z<ָOwhKnusYha*>)'yytVSVKc;-d۱bu!!P,E{ȉR4gKr_+LwNu>9TIwbVWT֝9f8nn^(8EOh鰉vavL'8`nezt"g4`?{W۸忊`3@]a;H`vz 0Qg׶H;dDY,*l ͣWU)$TƂd#"ᷠ]o\ Az'O}Fwm|I,Rg2DO]"k/%%\2NiXZYh:EwsՉn=S5uB]O-bYwFhSr+? yeRNVc|$Yh[gkƒM:׸↬pP:7PKwww?η±B>:t`!8˷|ػ$op,,rb׿< SWB0mְeÄe !9\S-7RdӿΞyw?|i^jdG(( |3&~`FN[dߦ /(anlM~2uΧ0Wf6yzVWm/}J+,%C]ђ^$lT8bGUPDS= g^Y1h[7D!5dr?{PE7z& q1Ffi4uTXo'M#A_*]9e}&k,׎y,M09]HSj jG =n_*ݵ֠flDabz"[#cVy4FYnMָZ9ѱpAk]ō齻 yF Sn|nq;O) {qQ:rW?~j-[mBjtNT:B*Caj3jX$"﵌FMFSF%2ZeTܿqkm3eTcV2}֡m[Nި% 2+KgSTB|5yvjV^l6wh|7.RLD[5W6XDy־zKjod5gjW7\kqպ]Y5NV9Z]&Oc.r> &ceml\egmI gZr9AfF]_yoO£+- C!fH ^ wE%Nm .ܚ ~̅[l3 ߆9*KLqXt죿[E<Y,f!Ynu '1beP(:+P)XC|R pyå = ՞:#,RFXp4 6% P<#_֌1y#5LJ-#3Ī#J7o=w8jtp@^^]zʒϸ(|@CdychIGH$ۣy]t#`@-\7M歬]oiO$AM0x SՎ#]hDl n웮 IM ר]QQ8 ҒPNbXXm 6v$U-VJsB}XPG Vpʁ;ࣶěU`̙ڹLv5ԮCsEUq%6[B#Y` F#LxdN0+D{-"#a:0- 9ڰk`Y4,r!c0&S:$( rZPB`#6Hcаˠci@ QzHLdu4(&  Z6hڠi]jZ5h1I/B=;=^.r粙8Lޮ>!h34> qp5֯w}Ab-Fp|.ˆKKus,[D:>U#vmqo HuvFόpYc$(4V@(W)G+?6"a].7!`BLJ;f2Rʃp-a2^jF6p./.PuR8(J5gRg'aS7B/",!gۀx|^˟G?MSz>M o,>">ϥ\<$*ޢظh>^"E$<LQa6 1LXbݤ>~Y[҉3ѷSo[ӷq20vӊ`ټvj|{, KOf: _'H4?Fdn=< (O)AM˻$#_Qz(mӤ%1}43%VSB/ B(A0uf: wtseYyo%P0 Tt̹C(XҶW.*TW2M_[C)1咇 @T+Y"ۤ6&'ray7kY/ p:)TJ-*Wz9O#whղ _/-2?#aOn%--Q^#`Sib{iedI;_(**կoF%5Y+k i1uqWjVOFxmŤjdSro">D]!K{@O-09)B7c{xs)4N֨Z\$IHa+ؐӖvQ@!i=>([=uZ5tVG7VȊcĽ4^T*V*ri Yjv}tQ [:*ttoGH* FEFe!wԯ?z@nXG,rA-΍.zCz)髪`uT u@S w8Nn7yy,C" ~X&S̸,p9D#䲚>7dռ՗.IOռXm~ۼa#^|9;oS=ufk"xwĞ;YRMJUiiUM89h_z=1ȝY=0=SK᤾q}R=͂ rꨠXĘ˿Oc$'兤'Zv xoMp˒>UױwGƦl/2OhYREfNO,mɡUAff=]2Ҳn ">r`H7=Sp$Shh6I68r(O`Z934]!VIx%NiLHW>Fq;)^ʡ˳EXa?H"]EÆ} uDWwvՑ\2\v fb)U[ CT.,QO6J#٠LfL )&mf݆;L|w5kX ayj+&Z{[Y[%*_rXɼx5*Cz Y@SJx->cݻp]>ؕTʪdj0ZP ~{yvF)|Sf\4Os7.GշSd|u/xf|y\6*=헐Poyey l6-T+ÉD/l hYh7m>3ᕚ|6  :aR/0xd!R&R/SK1тFQ4`pHn v ~cqVQg5h]/@OKDịUh4BkM̨3hQ9b0XJBԀDL"^icz.e{߸3ZsuVLl)iFyZCLH$ ?w?@)U2/MΥE}Q<=\LK,|=lK*{кM'rtL÷nG;G>\]P|dQYedoKg,wU\)1\Q1\Hs̖)).WeGW=g 
,o{eQ=}q)]>Ư;?[85;*vx;xZ+`wLJݭ$Ye ۡR,oBfX~-N$$ȊV[F땡 ԝo/:rv F;Hwf? [K[Z]%/*YT\%p+ǣ j+XHFu}Qm!p+؆&9z:oF#_G,V"5RNƛW:4+clUrh|Ӷd}8T\–Dz vdA(afKIK-%-R/%iuM_`ðK"@ 4 hcУ =-ba m.` MC{Pb-QH SE4`XbݧwpQtpZ:l s عVҢ1=t: J?t-/78d?EqS@maS ="ǔrb1q(}#G4A}u;%QEv|s_?UDf[aJ'i]~7P!!YFfIJ3M3'+nϻָCj{d|ҒF!@ǜyfT5kbaLM@y:h)z%m>6N]DE'Q(Žh'.D5JW`lw5lnuaOoV)+T/j- <_Lq=+>%$̜⒞A=mz=^ǧ ;14ϒК>GgMV|#X)5{NU ךU?f~i4]E2ǰdg]b?wP.RQKĩ[>@#: #z#Hek547~̱~ zn)w ΛLK9ϴLK9ϴ3yn>uY8!9!Bܶa96|{^TzzA#uJCe;Z /UJYPפRU۟^m#gh ,K\@B`ާ u ClBs-e}bt}/1L) T?^ߟ~h]nm=$fj4eӫj&0SR{LSPYm_//Z矎ݳ˴IR??:~5H60r0:Z*N[~SR=E˰ө,L5k'& p~pܓb[>s*5EX'{}VvQqz4M{ Fi=Nh`tkHzƭ>)UTv>N)2иGƿ4S"X@WёZĽ0|d ɬx-iYAh|!V_#B3ΡQ i-7~AIjk*Xʔ jUf,IB9[a G R-ډJm1oKt!]>|mubI[aʏXR}e!;r TBTþ "٪~t 걻8<:=}'ً?X}vaU`9hOuq =Oл~wK) =ty2]DVוT~/{1SaN CUqNįftPF0oP/m,4.e\Hԩ^ںXDdfL z}]b_٥ë ডp*l:u"X]m7/tff:*6Lnl$2iH" IP8hz IMycRaЯ"?^w~Q~wBR*mzZ= C,QObIC]R׭i0o0/9Ewl?>T| >j\&ZFE3ُAB087 oꨃa_% xsƯEuIy+1 WX@TQ ҠщLsZA+F+M\ҞtA&90Y&7:,htTg>%bTr95mh9Wsz4.M-ѽD#R:rr CG'{ u$ɲK I 1ǘN5ᷛiRmG@ZvhXCQRx0lLNҝ^t2v0\kwUь>6GVǣ=ؽ1&Fsv+FQBd]]-UO6?6? 'ƀ`|hyA*$+Ǒ7'NջJf)Yî2jz91Tu[4n|+x`]%i{yy@{=i} uv:=X1K o47xRuֱ|bcKiwqqų( Un7. t:kV7 Rr(m|bg07/ h;`Փe eD :Ɋ1nvs9b7g_?|v|_dgĵ\:̓`q)5k&u|t1 L",%g{ǯw5\_H]ǁ"tN o :02ϙ[3y=˹:nv;;mHJ-1&aϙ$sH@l!cicGXN`{bҗ}IzdBu8ǞXQw\A}߲I18 \ЙLNIwx\%:NwXk]HȧyygOs-bwiCݽޙɛC|GMf9׳: =#`01 ԍ*]js tu(0E{rIԉM0b8' Q+Jte84ۿ I"|eK7]h^WӖ)M0?sE{m^'0{bpKn0u% ۖ ʓvTƳ~}ccպ~"^Wѩԙ{eh;zE[#[<ݢM`è$-h꺔B]ˁP%az WUI-@q& 7ңB洕)iSjڹ;_;3d*w,i qfѝei|Ū/2nؐ{Q~Rţ(5w^; :MSY~0Mӧ$%bMjb}6uIш[^MKzX]_za*/k2-űmvYc#QXBr.'k YL.5"7nEkK-F0@Z}5I `P:6=OG[~^4s pZMYޅ-jdn+9ϯ?OnEaOɸt,*&4xofI^4P!!`0۶Jd T%ЇcxG~n(yMIj_yƐ23fnܩQ $X.&b?80! "-l]l@oGm&}LpB -lVehӨ>_jHk^P˶AG\Rs3Rx(m !"|mH_Ɛ \ne21ܡ۞zY~wFG/5QnH=IT0WـƐeF̆V Ԣ:CKl+H{4]`2ۮ+LƮSIJCǤ1ӗg6ea[7&FlnUIƱo01zQ/8gQޢ7#ZhЮOy4,l gc+>5H{5Iިf=:YFO? 
O+Lw2w?`vgGe%vB2H'$ΔmY:>HJ3؁|8+ʦL\r{%h]d>fR,gfSڢ9ʻg(gjd }\*E=?rlBH`SρQ(1^VVA #M`jB&]9]vwS-;Wɯ^R9 2Jv&|mlm#D1c~mο=U8A&ݲ3w,4YxUHQM |; Sq+6cm?T'm+$1J&C< n)^KEy!F#r;eGqqnsw-p/I3w%`}h+{^+X[쉹!!EY~3eJwTy KPd3 Rl|jc™^&{j<{@nSH%fv$T%+[ۖi^h+ 4iej?=[q |4gލ';Uai3[{Nq;u<x~|X fH IE`iF8 Xxe|Ԍm5V$luG>pGzfrޮL\Eas.aDRKp(Tȣ O.r@QV"kߋqhkE9y4֙ eW<6Je^Y[z_n:jt}!vH`^ ']4zzS_Qߍ[F4V;g Qpe%QSbtBzkGaBGg@S^t6v U5kaam \ucŋ'Pٍ"Q%3٧#n$sd6 \dH}bj⑅_L^5LKXzUUѦ>dRvBCYBii,/ԨrIͥMHn+ظwE}d%a]e=PcbRAGy~rsbn|wll`aҽEl R^W,AP~h W޼G]fl³Im{ꎁ shx0T* r64.[^4ۨI|W^T8'%N*ȍ>?0S-25Ϟu+oW7K ]F{[ _|q Zi/07QALqހO&F֭(ei:V{;VƸSV7 lXzĕ5*7!H0FUK[o58jI06C&L|jq?vL"˙d{~/yD> t7.9fiПwͨȩ iBF%l550X56w5^,J!0 W#Ē1 .Xgub>7b5g5gx< "{F^i=;K00!LH :cyQF4r)e;w.#=W"dLFl1};vKlco/rξ\Yo]J/joEp=fP//҄0B+'Tu hA2_`Vuy+ғquenW8`PA]0Dm.Eߣ͠-նmm?ͲC<@: ӣX 2 8qZlsG n nhZYmigK;*5nMI#MvݱxL@ { +)ٺYqLsQNCv+0M`@Tf) zLud-J{l!tH-[Y?l|Ś1j6}8F DXc]/{TlZ]fx/B@bCVUzda8ӸE8y쉈 0cX@ڴ;խJ?[ll.٪>7>^iBaԜk+f®8˴8ʖJ^ؙms)xTR{ {&:}{&`▝%lJ#IoFnZN{| 6K)ʙ@B4t|;`oG@RxoDY 7õ΄~1$@'}Q[rS>mڢy9hp-Ҳg:Od'xӲRn*(,ѻ8Zo[V\_ \6 5}GI9GKB[3 %(ʢǀgk/351^ r|'q< tJWkjy&V4=k7A>Y@a\n_ p+9N?qks /+oUռHpQbV#BiPJ8[i!Rgm4?waj<1hQ ǷF]ip}o[JP\Jݦ%"`{pfuՕ7KD^;a<ѻ~̹l۞Eq]2Ĥv; nuZB ٌ:Sl|jc™G=O3i\{44 ɾ xH<R$ā ;vBBCD *y$0" aG,dM3LO; zY\2^mFQrqDT,~WH* J=L"7`ay4@jwA-?~{9xd7M,(%*I*k޿_|zw ~NƼ{+Ttgwz'éõ@./&7iǨqѷ~SԸg5+/U6`Q|Jg\Wp$ #Ŕ_QxT:'e(3.0 'E訐)*6r\X5zԕፈͲfYx,Y, o7›eͲfYx27«^~+ʕ|?|64]*k_Y`OTaDmx'PɝT45Mɪ^t39Cv mgd/z'7sDZicZ29Kݸ2/Z;rW'Mx"~+ϵ蝤SwߞFW7f|b/6#V׎%˺1ԙlG<2Q<1;>ˡϏW=quӑa ;숀ɭJqX\cqzy >i-RKTd0<]j { ͙ ;?N;9x}E:Y`0I?lC<8U__*~Lz_Qgڛ6Wa*CCW;_MRWΥncj^xH8[׃@ ! 
ٍ"Ac==Ln0z^3sf:e76K ._,@ԯ}~jx|5.lS(5c{Qrdq8#X,B0 ) )ŀI2"RL 4`pHA);?$X4](ӭ *!*ҮU',7_Ew_|u:_}Yy~//SŲ_~OO?~`3/ꛗ~v?wj[I K~ֶT9-t ܾsby}$o;\UECx ^۵̶-C·nḬƇ ],}Ubо7a2[oKx蟍r΁I|OLs>™wtłD}t%90?A{ĢakkK[=7Wz,\e8~]KJ3-IqJt7-k3U/i泴@gܶr"| V$66by}qżKٲCȲS5[)7N$l%nyOk\5(*JUeYªBF%+R+O`3e&jݞ4^%WU=t5n{sL(QmG)VRaT0A[R.%JH!A{2b",S\AƃU[]|4 H_[ Dv $5^޼8,`\hl.Z~IPMeiVdFr4w|nv+1}C7wlԘ|epY5L[xil Ȃ6X, &Uyޝox]gs,؛Us);t9L}*e;ʓ?{1erRV:{ `ڟT/y}K'eLcpHBqr?h;|4+8 %\W[-uM: L1` Zm)OT3BwA@?.Ψ܁ii}ލZlVRݫA'oE'XcMazZ@xİ00Gڬb{"$tBI k*P ʂ $0Yf8r=09aXԡ ȫ 3]]C/jW4//4H%ݫ̀lڣ2%"i>[uMEm&eVòDiOҩvDJzthC |b )Ar{ܤ84דANSa$ bH +Sŀ,3 2O'_ʙAP"4+Á%")# FT8GRR"=YotL<&1YƨwnB1 2o0B ( e4ࠂZioDWWBQw9v!Ĵ1Ʊ: AS^"hT1 H&P7&4Iw4 \Q%fqf# *6FpSFfG/#tv?ѺUȘ>NY[tV͓}k r> ŗ<!veVFaIYH"k3ǔ*Ã2)r6rJ-XJ<#NAm1G~H#XXőR!RJkz-YpSAEO1C!xNaO<*0׷8Cvֳt''3g&XsG$ɚ$E I/̲{c=tTH]R$ ͱ L-0~4m#=Z*m ոdy[ [{epc80e+ehAw<7lG֏ >^Q4:+5f<,f_'*-27}܌9hT->#=-pg9\-擴I{ɱ6II 'Zr8AfCMS7PϘܗytgR,Ar0 eXC8c U c tO@ ߜ ȇlSd4Z}:=e+ݐ=6?R[q/RASCհS>Q-Xa͏3wYJӌx.Ueb&eɔ_oHR4aKO+L'|'%C6l1v8>}k֤t;q8-fYt4'ICܟ nEy+f0`fL'CboneN&n{@w!lM*aRb[ѠR{%aR*Ly Қ)RR%nIb΂watw_tW2Px g⑚8 X:ny>L3T-k7mIɏog!Cj0 C*i9Q]o%>hIW̐݁eG+?BP2bBdxӋ>?}ֽEyywUTj-k~a-IVv賩kK4J=]jiy0+IUus R74b}VtKU 6fSnޖ2:/vr3?8ʻ`EnݮJ=o5š"wIbLkTyz!}IDJI,vQ %)"`.?qsy}6"ЪwI76w5գ,Fd@kŸMbwQYdP!4-.hj-R;ܼr<0_Aڭˮ}ڼ?6qvp`j<46},a7]>FAn9J9x0H~ugN]#8_,-2{ 1m$hgص2e:Eu:6{\٤ |EWvqg*HDd@_<*t%)@yd(j98Ox{r}r*:a%ZQ+^Þ1v$M)%#I^(+XB S$s8Vl{`žWkv>{cqUdkuT{ ~~a#sj9Њ6U?&3#O= !xҀ1 +<Dhx烃QE+s}la]|0!S&aJ;f2Rʃp-a ^jz<>v_t khrq(pZOtiQLjkX|.vIA9P,S@*XL-2^^u2^]Se֒5V)1dNRafX4N+!ʣ[5HQwWy ]B/ޅBb0S&Qy2R{SeSɍ50"PI5 Kӡ.,HKXBIÔKE**(@/h$SGڒ\K W2WnC` a),z*#bܧ+i 3(J5gR#a%=&)=fc&eB_+ '0(ܥnoG,i<]Fg\x6ZI,=N1fd7o׻XnW=~;Xti8EYW-KLFfGçf28AWuIAL$!5g90i4?d+)- LC\K;t$LtR+ Ju !B3iZ?lyQejƿj2!Fl=Zd.ggEf: TV{1o9jfH]8bvx?I>{_9EL_I.Dę$J玂f;/51Y%t6?\Y'LK>)9CӐSR]jC}x:hdVjT,@"$,q0=W$?ѧq NhkbTp'"A4;cC: F[CbyGb 81¾w5zu4V7Go|NmwuAEN|rtұ1,B:C:,k(p~!fˮv >b<ၱ4F1I`)$dHh㭆Ǘ<,i[cR;ĬW逰)A29Hys^s%@ckaC%**z'?F۲9gǙʔNwU.rq]M?/*|V iojgo㷞+M~0~?7[8@CtcMR|/@?ou]~.gO"3w? 
TzGbk$p}1`bO-P ]=3(R3͞zq~OXX`АȜ ,) `c"3F+BB@M{(ݭP";>ofVN2C$ImҶʤ 淣:^B W+0HBB8'h8L|L( :PC~<@W1ڜ Q'o3Hh!rbY#TjNֿng߮J;|Hd|I3,HL09N־:Tc#Ϧ ON_ݙҁfγ;7QvSbGRW_aU !9Wj#8z8l :%nU Hf &`^]Ȏ?>|ڃsimo ;MլȀZ0fj9h L*/<(jGZ6rDŽ72=NٞE-8, wZ)ယ= )D%T{s$ غD@tTaޜc^M0@3bTiʌG%Q,jRUuG N+vHN;i;SSvITxꋮrx=-~5N\r=G `Q C ? dkMr+ʎ눵졔ˈ|ɛ|R7pΔt҂x#JP5eZT:ӼGc|ro$Z3-vYGNI7ߜL u:9] ǛaBʘNB$Nd {ZDSCT%Z@A:ŜvRݶ.9*'(奔>L10ʜoLJL30R#-CQ$ThD(dݜ z9Jt]or u [P|_I{f̜_&IZGFc#~.^V~!,CZeU sSVg_nrO."e5!+[ b?nc'*9TyRJZYBs89!$,ae$`|'M")IGtgx6? su٪mU=];Uq7ȝC&S :k M:IWg0 )[ݛAskԴ=L縺EͺChލykR0Lv7;;C{@Xa϶{:Lp;zGi_tb^d]c:ܮWs!Jء,~톸bFoOi*a(xd5LjVQAY^E%˨sd4 4OGh4̲멳]O'nk߸־[k:cܠʿ!o&\Sǂiϩ".DXPq*FRiEkk683LݻMnL~ix>UU2iakS v//%2̒,+|SEC^dA;n_*a^eLf-hq{Dٕ>8h/5.3 c|>\ v33Va#3ŝc-*W{v*oPR}桻ٓCenmv7ڀ"^[sI iBBn`F#(y$!X^'Rfuρa~9\O_Zۮ3ۆR6꘠\tmrmJՐID7/TZRy}O6A]`qU1oO|o/_G՗ Vʙ3V_<57cKr-I2olO !gQ#YdHr,0sC`/({֏aDG똵,|}38d PhB"c HZDS"R&4 TV(LiVk嵟!t>?N(ElRW0q9m!_gl{5- |cjאI/Iy?$댈fu]Ql\nи셺k18 K[Y)7>&z6ԣp!9gS_[y ]Y6`6]pnib G[ T0/v4 8AdzArwl6mRD_`-k(fa~Tٟψ"0%@]K4D?U.:#{rPlF:c̐uC4}.m /aofbKFmR+{ |O2#`Ly2w>Q:8|<~&}fpLn+^]b>*? 0Xq`O1.|;hu xV˥PuWbKG}Ƕ %4^@NF.ӜI 4ȉSV o]2hw(\\őGzWCȷ3]o4b=Mэ&T_;㹩uANZݛg w#csHtl 0 $+_@BZ>b<qrQ[$\%Mɔ\'-GǛBBHLkB<.hMĒ\T' >ipD$n) {\)śн/ot}rj<ș=z۴_CUzŠ:`B,`,!\]Mnр7XCⳚyzvlG񹺂Uݯ3E򮧓pΔ.0L̄D:(fIP:lENQߓT~c~>O}m0ηtzugse -&C318XK~8zPPO`0$HԸՁo. 
} 62 #ƘC FPH()xʹ*OOPv !-uVRUlb3m3m'UzyѾ<Mu^pA_Kf1 ]lߔA D`@(Qږm.-q<]OP$RZX" q rsT*PIX'ySB6# PRj fb`АȜ ) L`hR(P(IquNuj(d};/}ӶmGbS?kmmh4IP^IF:X9!HGeZFXW)}rZ:r-%ilVFV ?\i^֮풀K;d۫s|]h!D5Ncp0u# `T"ȉMfpR9*{sO:ga}*V|+C)/"^Q$ Sz#w@Ҋ2D:RX>Sz{ghRǙF/Ol@D:;7sSbGRW_\U !9Wj#8z8l :%nU Hf &`^]>>Dڃs1m,o ;MլȀZ0QQTm%r2K$YIMDnd17kgE˾"KB*"eJ8fϢf {I#^$8t?.$}p؆7XWcFP$aAq2#QFI{+f>6ci Nj2"j/޻'4zſ}_HCb-!f(+oi$RӶ^w5er Wd?ߤ49ī)VT~?O!>7C(N`Q%^0PAx9pZᢷAp)x& 뙱1$ˇMD&$S#FkuNURdd?'5Cw%Q(Bb$$ ',>ӛl=3͌q_Ylf"'`k.uUwuUIS0,h^7.MJl`2v9S/|TkԻ;D-Gvb1F%Q 0nX @Y%2ނV(E z)pRؤNi76+ *ՃЎA|އ#z^lo Oesb"Oo%,DK +w;;`2Q$ՏyoXn:>۩2$eQ\:ZȽ1:)bOrqZ%}ys\&PYD;bϤ )hRb MҀ)h-/Y~Ng*WQN J+iq5Siv&U 7i&Sx9tYxK!ކyqCzJ2^"D$0 ,MsQСC`_/m撩_!ڔLa*xbFFY^ Ŕ#VTY4>N1CʨA$7}}ٽ4^3 H` )5'T8i0 %K@:gB)8ըij* e10ǒ8EB!ʭ%i" 97" Yc\PVZ0*^$"2WbNsa Sz ؑkʨ^zrTN'FEkJQ_@OaW[>,Ö[yOŽ*F08N*j5Itɪ8r.sFx"Z19 $,3U(y’"i[шXGWRh9G1RK,( 6'R/u,#53Ba2V #VVښ[Bu<;"MJ_d!3xզTi!dL\ 2c>% ¥!]Bx4'S0,G8tE0/+/?#WP7!jІV-1JH蕸| GuΜx0ͫ|EBO{gv۩:)n~pqs'Η9]:=Ql[Lcp̳~..GwoQ<|@' z(X;O~yϧ^b׫ ޝ刍{>y}xh^n?`:k;ѝ)yh'~hwn^ E}5~`VW=A*p4&}QZ`t#>Ӎ9wE0~ڨC:|7:ARI~svuA4<faVá? ߿כ0  esnvC@8أuŜF[>: ?3?50G#, w#>镭ਨ[ sFAl1^_9dp{ލb=p<1^=ڍ}jzaչk EGۼ??(^ weQ<DfzvP/@o0<D;[O>yGϟ>yv~AEƝzTT $!Yˏfɺ[ +Y?O l7Z{o@ddޭb~ mE~Z=5{yS߄VȷYxs~RY mr{)^nEMP Gen G{J}NJ9&φ>IେL XzE>MFCp%1\Ur- ! !ٴdnK{*71VqP.g RG9n[pGBC8B=вn<﹝sDFg@yL~5 r+ kyA``hfɇY<:Ux+<.!îcAS^sf&ReףG¸?["+=wH^zm>og7;PAi'mUڝ❖mOD l#s^mX!K]ڱY*h3c]~@Vda88x@!͠2?GY? p/ aei0,swu=B 2ǟ=ԓ&)s#f\p[l+1GֳX8%0mR4K<4ʛeUGYw0\n/ '۽# ban}ps2mQnǖo<@e͢i3/j-/ժVh맛/&O/EbmIvhz |Ixq8)w;I4: z4JGtvN7)BAH͞%i(>|aI?Ņp ix`דoܜlVO'ϝRu7~A̋ո~@¼3j}9b. 
Ob(+Ɋ'Y햿ۡeBI|^~%;vwwvgnD-H?|{C=(<@E׻yvӹ]b!~|PDY{`f=^C[eփPk$g6ECAEI^Gswy2[Bex.0`` Av7y/ B# X0X?cm}?|+6b+} $>@G<; 'Mij(/v=za/{gHhyJ'j(r}gv%&0v*g$6>qٌo~ 6VncC\xssvݿP+Dg$J)F0G(d]jZ@%&.{”9l25{Sַz$X+[8O+Rֹ\ͬLŔR3xKAJ0&k(©b* .V5P[@\kkUoU'rʊ+ڔ2*P9gb[uusaf]Au]tMV]* 樫D0A@Zgȕ R*PMTW\H~U,(]u]@Z!R]Du%0.', swܳOATyZÞl4u.){#=pGs CNHKD5Q# $MtI"62}~pf8h3 m6Apf8h3 m6Apf8h3 m6Apf8h3 m6ev%Kp3ϴE"m¶a[-B!lE"m¶a[k.B(JxN "3-WuZ윊=*f 24ICݻ}j9?C븋IrO9xjޛ)6>(*-G(!'\#q:5\ bؙT)Rn ,MsHCAWdHa1:S(!xMv9$|͡O¾P@W`ImRə;%Kbe\W$XݾJ˾!tjv3S{ Fq2,z/b+U,&NqaV8 )VrPM'K5#)ƐRq"N~`(Y” }H BG3RM `a`d%&)#H/B[KD2$S Arn?9'au1Wͧނ+v}(fI-n<9\:SN]\Uy'QޙcMCc:IO֨LL6e\u)Բ/1T&$i42M2Il.o(gI{ȕ+B>iś=7`<`0AF_V1HT4\9G@&g F8{Z/PŹTn-q}UJKiQ6; r?$V9؃Q՘YMlD}5] Z9_bP\y_ty`_~8 ]?.圖]9w5g=[oy|ZzLJ\}eL9wy-tnOW !2&ak{Vk3y>i^nMUa5;p0w7 9B/Hbʊy*:Nr9F=c&Cnp sNkC+s&>s p9.z'1_/ T+oG摹E- u;e)cHTB*MSՠLz5芃YW܍vCZLGV.lEh"ĩl&qU:|= -saEqT\ E )pHTP9u*A)zSJ]:z})*d#1 %& DL!"M'47`hRT~TvQLx|o'^lYuͫ4ݚPK~;bE*p#Z`A*2-k5D ҉SU#zh GWx GUeGnB2ey7ek rEt ã_dM|[LtEɯa7oSw^^|(C$k2>|MBԃ!J+Ӛt"8 V16T.@*8jjߥV)^_uH (qڣq4tJܪ$Hf ~+rhԵ>$cByS&i ;̾NbuEWGj%9h L*/<(jJm&r#x[)(, wZG)pԚ= )D%T{Xģ}Po@c!3g`T4eBryͣ{Tg)W*>6Ac-сJ())) j G]=}G#oxGZ|].P`$Df-db|a5'VQp>$W?{&i>L"2MI1g^YY_\gn*0ԤCsMQ%sȏw"()9ӞSE11.DP6SI-jA"=&bz گ֥35gk^~.&2Q=.6v;z,vNٝ`x@7ݼJy$PM i{M%as<\KD&-qc;Dn^)%?se2s˲(N_xfu 4q7S Jw^},Ju/ڷ~sF?@WaԐp'wտ]>Ŗu؏ċQsΞ_fH_~͵Uq]O~J?-˸AimC\ YWIyd%H*Y*Uw(Y|P,AI9 Qa]knF\/jY/E#VpE ĨaF:R?P Q=YkJree;iY3O4MB7¾[݄y2Bl̮+P/kdQ(û=| 6O7-j ~*8yA D($'0g=Ae9=aNB]q|cxў+sj kFWmkFTn)c"ZDS"R&4 TBVC=AZV=GD:?'#PJ 3jc"0 M / N0ﰴ׌=-^?جؾ> (ZqK|nlĞ+sSkkٖ ՐK>3HC34cj|DA/(Ni@#̑|)<.C?2b˻V S%#UJQ姪O!*ĮvEi}וʹ`/Bv~q?ee>;GeO{^OVo.eG]~Of?5GP`\C#; /rL-..7nw.@2QVqN|~>/7vMMK7VS%?]ΪӄVD4ͥQգBl/NgЫ鿮O lЈx-fj9Jy?[ Nqd_K|_Oujxi$2͙^MJxbV- B)"?8HoT׸ '6;Ѵ<sBVW ݥ3Yp&sS>ܺ#ȭ5,"vtUC:´<++NߓT}r($XG呅Bb(]DK.} *lDTlяWJo3뙱1$CĘA'9%xp`>QESF!%b ړ[VE,(WQ;ԚAܭ["戲`Cd7dp> x JM7`:n9!T0%^] `^]\a^]ZM_]J#_2`Wf`W_)iS2mj]哱 oT)qȉxz|oP[Lӟu6U.avpyukC&hqrs;)ln*Ɍ_sp:+)'جߺ$_֌k)և<$@2m8:;L*\1Q=FT\.;a;?hYQ@@Fr ٔX:| aXk2?/4oSR^o=<1יh)I{:E+ P.qHR[,ؤ_n|Yorg8C 
mA;<6{N5?V!2x%J,xB $ :9psEѠ4I'EA-e}2YAGFq p zƈ IT}Pֹ@}`4_H?g9mAP{_Nhyj.Vhs\hp 7(N#69k5IlV3c/"E`zӆK) უL$Q<$" 9ԤDT܈FB1HZ n\?PH̸R ͨc&aӔCܓMZ"UP2?{0^2e G @% C 8eM4 vj)8p.S>Z;q!%7IY$Y Op)1eКQF""#VgSNbˉqsD+m=P/Q2krI㬐9n Gd)se;l2UyP$% ۘDs$(6IhD& 8K x#@8H}Co@$\]kd4;l̉oEGз֔"30C6 a1e7A}ТVx{.IQ,ʦݶօY"U=Nw;8U& G %jɭZ<ĪuUjJ/dp-M,@8քBGW!k:*Bf;uF5`keB`T^# LQ`^X4n:dZ{"XT!ThU%`}5 eoW@ԓ*XlG;5mB7BJJJ#L2b!]A2uG  QA\R}2ɮ3o%Ce:|=ƲȠU %;.j!ՠPwVrEdܠQ  @rLcl,,xč:L;"q b004KPI!0ά :P%@ `-N `I. pqT AP{ T4gB@Q"Ł.0̑EUPfԖ5ƒ#%d~GBY'@PSQzr{RQUuA"z_Bur 1r 9 VDlZ"5rP"ZJ( e5ݐJt aUh#ǻ=sA ԙy?7.ۢݵbF\*"͘u44U ULl^vNR'Dh!`V9{,ٽЅ76BАf%SdIW=$ B+X(IT@E&rZ5ȼ`|ڄLkt0XG4/! Ƞ*APP#H܎m }KEUGUS Y_"bΊf lݼ H? VED [{. O&!z*4di* XB(;KS5H!/FsP6DD ::$pYwQ@R@"TePEx(%G6CܖSBEk .IҎa<P9@1^!B {!U3ڄ`ctGJ߳yBP΀H&e#kvqAsl:LMF"h4$?<{ "Q8TDYUQõ`Qy0 !.@IE] [%VmZbMJ!eќ4MUJPD[vTjj5]ZZzrDu?2QIX@iT_tВ&7ZJmmˇzvgw|{jfaLK7h;uT$1A[&n[`3 .-zLs'QEEmCh5(U/ yHY=yh4vM Ƥʋ9ݓiu7l}QeFI*vàDy آsE6G=Tʍڪ{(-hGr%5 2TA2 5@J'2(xE7n֣bb f+Bq&SK:)Hr;XQ'y/nèIV(ƈnt00RuL` 4@ QYjЏQ3j hNML6t 9hҢ5A&EJ4/Q4T Z،P-EC9yyeA^;P-j-z5Z!@6(-|qVp*-]QZش tLօz4CS3AQ%#pTh5]QzB%aIPrmhf<[1ixxpQix!X7j̦rz4DX%d,Q(b:OȨlauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6ꜬQG 플:6u ;gOݨCب$:=凝dlauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6Q:lauبF6|5;!>ݫϞrA;A׻~ߘKkuh(|Et|E#NW+JmW|Ew^)w]uϖ\P@  I(kog(ѡ?7 ׾\oKæ͆`1@ /oϹ0'W0dėȣeyW6,q i5 yth-]?Uj!w廋Btc-;_4xKFLjHmCC8@ơ+RӨ| Q1zX%gjCAEf(!('66EYcÄ"[e'# S֙SrχRHE285!s j*t2S+=.Еҙ)Mđ Е IQWP* ]YD+>NnS+UPzttZ$a:sW y5"ABiy) xt~BSdJЊ< {+LWOKpT;FiBWKPzt*ZXLܽi+BPᘮ\⹰څGQrLW5B>*](tu$pZ[xt)]/,'DWF!d p ] (w 3]="ž_78˶~Q#ǻcr>?|AAkb}%^2:C)W]ܬ HɪWdUBkܩgUBg'U_c=x<ࠧSn|Ǖ֨S+휐 ҕq1L8 5"M=9 B+rzBt'CWת 'HW"F":] ҕJ>n[;TrWNWs ](A]dn!=ETSH+Lh=L O~73 ꛡ+]/HOryt:pxluu \'x\:zPpZt%+t]/c'0wu`&CW>GG /OW)'HW۳Oq.]Xr"sy~{]@ &{QlN3{(&Qu@C^QGo{Qۧn }=gT794C}eކ/Ӫ9֏|V _ny}^wOLIjeli0F!hF l>Ry{ -+NGl> ^umWny?Nl!z< l!zj)w7oY[z}mҏ]]>x ŷ;Zޞ^aKȶ~1jq 9:tcl/g-ڳ7l1{2^_e8˝޿O{²o-lWAd F\K9JcT]}WzNQhY=xu" M~Cp7y^ ro'p-6.^޾pwA=Y̯_/ijw7 l TrOPw-՛R)74P<p弬~6|,+jyΗ,m+~#z.]x.{.6G)/hp7#|T",lRu+ڒ*4^iC0e["1`z{ѵJo6S: oXYZͨmW(zKc:d3f 
wy_+4R].mNomdպrD5by?kp~7QJnaUg.ڵTƭ£UMM:tҚ1GJkmu}?MfC=zw˲/7!狷9`:njJ蜕V Lvz"ĮLl&^ZŇ18NXUMbvAέVFg]0D~x)2b6NjlQWm JXڪWo=YdD|}FMg9ŋψju|qv&.O3_iR,DA?[ٟ]ohe&\~l3{kb)?Ѻn\};~Vɲ )1Hѻ5ye%Sr@\_>uwtb>o M|/^$\뫜pyz"}Fի:- 9~[v(Azݴeȵ9Q#9j)&CRqwkZ&|@9=`Y/W7/;?[͗g۠JZ 4O3fZԢ*70}")JdQMɢ*a1 &"22Mir |' ߩ(pɀtg+E Fl& C=EpU7Pç>Se; `vqYwZ#W,XPX(^CLUTZ&N6:Ȥ6 n700D6dmqYA 89ix6mUf98N )YFEݖu#hq- S|,oo2'KwÆŎ9WI 0*l3G&ʛ,P5j7{'ZLc~[͘VYd6Fss\ݰ`i*-LDC_oE_mhם.ZP>* oCb 9脕lh@g/hgKssN.|s aXS,7NWDM!gQ#YdHr,0sSȏ`o(;7w$gw560XXQ+slwkK | 4%NeVI1S$F-S))* ƂqID{B1ĐX:o'"6FB)+X8Μ6;v(C(.(CR08ԟ<`r;]躄vv"5 l\/4ݩ_߄*"ÙG`Zag5k"˻O."gՉs(2;5eeFw1<̲M Ȑ9XFғ|C/(L !v\eG%!sIK ?x,rU5 &v:,ǦGӒ76r3m<*se a8I~IJ|X9g3 [TѯAgBio^SD޽KUXYeT~ =: FÍ]L]?ŷ8(bCl~tP,z?=/M;(zw\P';àil?wl1RD߰a솔 PrC78:')r<_ >%E`J)1[cL $F54lLJpQENJxbyN[B)% HK8}k3?uڭn! [g~8upCp]v؛¸~΁[p~;cɬ@ݫ_ѷaip\gBլ9'.yª)XXАȜ ,) L`hRP3P5}T2I'hu#nCWy[|\?U-2wHl3}FEH, rr 7ݨ5q R, *:u&=Mϡ;{!6'^CLB3ccI7TLD&$S#FkuNHUWUF#3oTn,#A$ w>fZYW\AװÖƔ4ȅj < MN`c$g5V]MsHUDED%D=b 693Ip9l9uB6{J9Vs}V&:td,3J[ۦ@5Enu9k4,ؗ~`l$;6Yߕho6Uq"Hn n$=,ĜdFb>N!aInct%$b& Ɛ1И$#5J.@m&jT,N)rmOQ4qjJ̤4 ͱNԃEA 4,gIwӵ[5;agb:WJ"&e G QZY$QG-Jb6 ߉@"YR{=p!fx::KFrKK6Tj5tIVj\ ̄S)4. 8g\I$ 5JEcie6Nq48ΪLcRrOQJG,??_^^,*SN ژ*. `&T-FRiEk .iRw頸OOs^rEw4MaJ ܪ`6KޛRhL!"9|< 1όFxI.:EKTx,H[6!KY'QId4:lq~I5Em1:'pȱJ7L]H*ݙ\R{L%] ͱKꪱ._AM Ĥp'dqB9rB2I .&Ť>AL3K*F'vㄹsp M8a)f'slc2w𞱥t\6YHΟOg|i}w`ty4>=,1bϭYN#D?}5ݓW3/90U?3[OxAĔ\=m驷L,Xsflve)cMb3qřa%?nv)cT}@߄*J<ιh:e9h3Q#5"mWT ^@ Llǻ2dH0/Ӏd}?>̀imuVXn1mE6aYX "ǝ2:p֟G( ·~N&">V#')y3V)] 4Go\$I "2mLD815?xӥ7 y+:x Xd֍;&l&E?ۢ:vsI=4Q$;.W %uV h+bAQ@ؾbA J]㹞Ƈ}ʢ2ID, "ͭGQ$(m1bb((Kˊѫ Ҩs q%|E) mL{NĈf 0Ne"Hj9VHtcm1.X:JS]CZNb~Ηs[(fe}\~4*r<sy!"9l< qjk%N(^2 hK>s3 Q9R:YfkW+qFnMZͻkzoru-!a\|j2%yؓz:v!k2vۏY?qov˯>,D,OD!9AP a|=wf Z+|ȅ=lmQD5Ncp0u#@QMfp2->MjWsxmO9rlfur;"K:+V2B6wp~n97T,ʃ=4[ibt]k]˲֐\:ȽC+r2Z"N]0O>AkϭQz0@6ͲCh%{Eύw^Xs~yȁ z~.2{KCWt\2e^u}=1k\=i3yfSZsMO~R*a( ;p0w ߛ$$h$4WՋ! 
DpRSG & M=QJ*^-hCr7:o{g~~On/[颐76}s$XX8DL!S"'47Hъ0wV\bxjM^߫or`a0uqRk5k4v31nG>fC V@!&IH`  Gi8F80p!N4:3-ҲM81jH 7D{T6_*endkK^I[˞OĝeNrgl{|9c'~zBU2%ce(6 򚌏"WLI-G@ei͉t"8vqg_n);Zs8{}!1Bxrq4tJܪ$"M^`]ѵq{}t-yO3d| f3Sbe4 L*/<(jJ@L6rDŽ}[ ZBXRY 1(S5{)D%T{ @XR7g؀cF@P$aAq2\p^($E=YpCXt@ڀiDU#1UxW,̾W+ݯμ~1zxů~0U@+?-g8=e/b6y KʘkWv<6nT3Vv~% jZRN>@/Knv-8 r1t[{\(*N2e(I-DͻD\Q<`eZ1Rk F+](j}ZkZccHQFuчruo}d!2-Qj)q,?2Go =0c.f(5>몭J%_|`^}b9.QZm7x?$/rX39U\$#څh 0Ne"Hj9VHtcmm¦ʭjwJ|&cRSuB{nyrRBڏe.Φ7.uien\$nN^rIS@߹mĐWk<Pcvk9J-CiQ6Q4\%0`[\ݞX(N gЀZDCdUzhPR};ls#?JbufC6"[&S !ΠM޺'RTbRWqb:ֹ]uj*5uDd`_oe_̭thl48P~(U oޞ'=F?.L'lD y:W_<.8^,vL wD2o\nC !gE9FĵX` uѵ/QvoaD佹HuGGh?py ZYһ`2pH)B)딈 Mc$J=AZVwrH,ϷJ#LgNϝV;D!Ra`!)ړ7y8Nr6AnE\%'6H_*]wrÜ )YreT-=GPϔ0.ùPpI`( ܼ!"'PWS *R%ϕ{& hEDD @bh;x{-K'> .O5ƌ" RPm 2$Ao(YX_%*" 'it*HF 6 OP$:9b5,Ig V(pN䈝ڦK cPkX3KyUg'BA{%KRI $y66jQ!v8Yӎvk #7r\eف_dBEO4qP1f_8s,Lݭ R+'dڜF$cJ`%/=v=~Dt$|~e"Ul(6{tI[TO`AggBhL1S\DؽGL[Y%7P1>&z9 wNrnS_[m 6blurR4]_\?MY}-F|/{4nrTx)U(?,#svJngږ EgL %7s|s@7(⃛<KХ7qbP}m8ZTsqt6:Byt=\ [7LhnV% i?DWnqlY8svv5P2Ӥ.rE Nr)@^AiيV#K&=3bhĴo~%tv{ǏHLē`!&l|rc)9P@*cNJ c-qC%|~DIjxi$2͙ ^aM8e+mHk>gy b{I`)"l!$Z69{ZrglHPG3 wV|He".cؔ5sIGoKMGg4[mt$hlR#}Rw=tkRұұ后!P,t=^hiz4rӉ\Nèx5i$-|,Rjd| A!^":|t\2VXqm\6CkL ޶ͣW3WL*Sh U9wn]JEEÝnNZ|@'?9Kfd]Ƥ> ZqdãctFR\[Ũ'V3L7-Ex>j0uþD :%λMU&{MItcߞwm_yJ}ؓt4D{S(sVA-m=3{9A .<ضóVZ#hIa NQp{֤)p}{B_^=,9[iN7m~O՘|ӯ:J.tBs_}ˇʞzݙ7۾I0'r?lӯ&M~9<7_g/`o|{0< >m18.`Iv%)Bӫ/}zJ>~h=v~igII[+λwh͝m_MkM;+n7KP]~C_RGC-Z_ҪR>m3m}#6?'G vWwQf_f\L?hՋy{t} nq׿8i]]a*???0:o`˟>|S\B?[t;}1U:\PzU\%M&?O׿N--Ot\T$}=Uyɜc)4-TH\UUd9 y?]\/K^cp\g$~1 \*?ܿ^t::m8Ԁ$U{!YMx?:bolL&;=O?컗_/c/ _C_|&wM{!7Q]ӻ3>-A az]ê$;uew0OG/G-?uaի|7w>v(H>qwh:TUEtKW-!3kJW5b8}dU[sr]ٲ ,f\Hiկ럴 )`g+^1݁I3QYQ|y&SpjV(TL/D5|ۍ?eM־g" ̹cxلDO?|T3/<&"{L\Bw~3iyĕՄ1OO7y~WZ$㮒i)]%)+B`Ϻ,- g9eJP1BqXtԿOvhGLQ'u8L0+F~'nA]z-Ҥ\D?ShPg7ł9 ˭.":F jLoq^cgʒ@ F/pJÃ&\R `bWC &*&)Y*RI1@`&q8wRJRR#tWL .+X܁m]CqWIZJRp5: ws~8d0{8 /s&)I^|J(°> w`1;we &)+I7UY _~^կ/듉rػ=0nWON֋;zjNښp32pFYt9/;aǁB (n ce6狽')90g œA3sa 90g œA3sa \I+%h B8dM& yBd!o7Yț,M& cK.6?7 /^cpܛpyW 
2$(a!B1yL!qp̸ARC*-XPâ[F,)_9F$㑅H>HԔ1тFQ4`pHn.tuFQgxMyo]E4Fik\Llmn 5fKNcF1ZleF͝QFGR-:d,[z5F];XMD_Ir(di@hq,L1Pº0RӔnA'tƱ,Lwө_WY@0%V¾ٵ  NwMnnқlEr*0cO|ۃ(ɕ[ $89O[N~t-JASs2aX A"ʹ'NPDVPl=D:l4!G` e"-1- nYpC0D CJm\!W~W0P2δK'&ЯF̠h,EX1i $2 RjDs$rŐ ܃ 4lgv!JEަp.li0Y%U`y#WRPPx`Ga( ei7eh}.?qk{hS;8vXGS%%IQ"AWpVcPˍ:'e43ϒ[ήx Uh&03 qn&!$G#cFF4irDJCE:<[*37#0iѺ>W#uie$06sL2<(ccVjl1:Z*8-57#] X\OkLEщ3^uj#D% N[pkeƩmcu%)\2]Vo}ώ- >e@w!k *X0B`D$| M8!&ZrKp3"<'k/%%\2N,s$nWt7UPxoÙ:rԚ:z% kEmn>wІGdr>d2Xeb~H֤%+O+8GC3Grzс eGM0Dn X1\0LXaža1JPtcy|Xǵe&#QHYu2LwZ ZFL&Z܊Xq7ڐ2CVImw30K}Tgj58 )I- ++g&8W6NN Vцi2ݡAo\Lp~.b|폇Wz ŭo^v{g,v,:wB JRIVYVWrrF{I~QvRGTPf'|,RLd|G4asX1cŵr˵PX`r ~G*4&L'bq(@ZC/RrrGXIiMoK/Nrra|Re,:uqߊ&n)np0sH[]D‰uXTpXOߚi!07KE&yå `i3cĜE NVF[3}~~LgA]㿟%XP#f:] Oɖ+޻jW^iɠIBޝz86׆LrGc0`;4QUTթ:ŕ﷥8.َ-|q3TkByY v?؏Z%:BGZ((kϫ^m{OqK(A$ḫkXra Ϙv<}B"7¨'ʻ_ٚ3]J%^zFʤ^y|{?](\N= ]subm)B 0b5EΡ͒-GphҊq*N)2v~v#7~}W__-~*)@NjBHΑ<+S4>Q!ǂAV R6!Bٵ$lq~l7rɵoC?G'<-yD=U1β.p'!(fJmc"^uh\߳ցT;-(G)= "EĶI Iщu0hNͱcnjHF$AKH.8!(Q{,ŭsh:8`|fZʹiɴJ3H +1ZnJo)w-"t?ֱ6 m 8+Q_X $%*&h=_gf?[OwT' nDďwn4ke끳8~+)~`hz?F03ˇ<ﮐ79=0"ALjMNL&ʌdXGhk1k Jvy[FF'lz{g;ܛ8.6v;xjUvFhY͛7[ɛ'Ow2 lk[/{/C%-Xgܝa1OSkf̚517G̹u0jR<1).jN\ ^ZGһ@ԤmNa-rM:4.sz|[ mm Jw,vg<~0P\` ~zdcvzvW^s&G~Ӊ :_ȐQyTF%≦7sr1 Gf2㙛Ǯd-VJ3u}Kn҆΢EWOP+/jX8<y4pgcmЄ+rcYcDX977ǔ_YCo!KIf󍃽юNQ @ hg CE($A3E8ImuZ{{ gDD֌ub/ XcG(U`Bn݌W}"dH%)Rʢ)딈 j%՞ jnD:'6F pgՎseLm  WnΒqS;uv\9Gn|>0^F6X/J7Q&^2:^fQd.+wNo9fs9d;N:kJټ*XәPK5fxof_b:?Tݏ;3eӑp؋Oyۗ{s%]|q y /З)>l9amyOr࠽{+F\Eaưehn:+OG'Cb\'JfS$;1AT0@uƮ2\)"W.WR0ʕaF* #^\e23ߏ\%^`6.W+\oXVhJZ 匨n\Jruߪ\J<`ۺ*U*rh)!.W%U\=CbL*(9w$:~gtYH)N/y1ô;!A|pM+W{)Pu:"am"hai)<1 <0~P-k.Z~ܸEV JPjdp ^kfE4 xDqTUDBʸWRh*f jWBr7\UVmrf2#WO@wUpee 6rQrR3+!(U0#2\ViΌVl\e >G4w+Yp \!Zö3Qr J @TH0g2rt3ZCH|rWk ^!B]!\ADU*[]e >G2YsY!BOuU}yUF)#WrɪDM7r\o\V7\ zP΄m\Jruߪ l޵`Jhe*å*r rQ*R3+@I0:6TԬRsZ8l{XB 6,zT'?5oAlN dy0m/mZ5  4q0;c$:'-u?c!o{svQƿꯎM@7lߡ(5Laz1ޗǏW-AXeK:Z3hE$w~FGJ;WNz;}<>x[}O}mۭWwã7;l"6a]:ӠtGw^['oGד/[+'`Y5pI$)ހ%(E-Ha%LDQ\$bns|j1Srn^/rV;J}Nł;ol޽;)^"e[d|h;2 fyQpب¢5Pd_pJ}Ǝtu>uɻ:J=1_m|v޿8ћ_7tBy FћU`< ܣk4:0x &hS4;C/x^ 
!w(;4Eəo4c@89u [!S)Un9B0٘!%tqL^|f%F-W9ݱ9ՒjَK}Ai+"dD"!'ޗPRB(% 9脱bQ `1ud~*" v7R)x8z df}HXgq8b.^,ƵO{ӴdW\q1ph%bpcE2J4 ⢱Q闢:WcU.>.~,i2nxvs-G{[ſpo;' sxtqXUobu"4E- 5^pI$}It9#FjCo71%"9wH1qJNZDVH*L)Ǯ1SCzTeq>3ryڧOm.L\^{8ͫQfϾZ8sOjsj Pk@ḋ.Fg^yc'@DxD|倡_ GjHM$59B-%pfVYQK1 A2q S> h9FO 5A::z0M0% !|n݅y~yv zJ%8e歾<`nD=gEIk)w5]l80"|<ŻO7ަlcqdhPDf{vm6p'9/aA,.\~͋sdajUNÈԡQgM\#Vǔ 8xY0~%^m_}"pLK'r@#|*Ȭ>oO4|l&”^ o= z\|mŖstwzoΏ߭M~V"ʯ"7Lzvby}2g3(U '" .Oɺ3 ޠ#C/w뙻ZJxSۃ|[ybQs4G AIJȀ%J5%Ֆ:y/66&~W둣XjUSu=(VRmrФ!$oC@_{yn|P½Ǫo8뎣4R>N9d^ݟyeg^Ӌ2:;=WJotT"Z,lzhQ}sFVOo;_ xܣ5' ڳA3ёZۤKw%D'+ #&t1޶hR)Łsf%%Ús8ɧ!B5 oor7MW>&x/hP<1b&ȌBe1V!RmeQKè /ƊܭQ ܛ8;ʲzfNmv"j5q.z"%3Dĵ^pi)`̤5W9_]ĥrJ!p)ɑQJ;SPRJ %:yDнfnW#݈Xe#U֔WeE%FR2t1KzyƢx}GlŖE\%Bs* )6BhujA5=VsȢl,w]4 >s"5x?fH"6h깁l eX wU7jgiSq`, ϒRqH^c.k6xDM Li<Ζv>޷nd@QGGˌn7%a6C+#T(b@ɒ!A)`SMk~Ne/M9" pt^|c4PЈI)#/=ui՚B'k҄1S-zb!&_f_js*IB{5K'vr;cnR 1tAFU2TSR1mbsIk#b=7;ȾKld8% *+R.j/)Ч(\+'#,?y\j=MCv0:qXXlSaC!i{!>*$#,*  r,9Jez,/ WY#p^?ytקt1#RXZpjD" QGo͇H?fg?;2?RX+8B詩܍s vd;9A R6P)QZOw"qX-Vm}3t|yԷ/5??kQ=qȄJ NGںjLU+g7Ղ@Q F@ 5K\s5w^ifU*Q)~4q[<;Ko]fP߽}U#~׎ߺǽ}|0? 
ygpn|J ϖl?g?ߪ}H??MpY_~IiyRҍe4^JFMqfDⵘ Gu8eXyT~X=#}0VٿV.;qZۮ汫L;$_OWm5:j>׋W-rhu#,^]A]=Z)\=?Xa ѿ I)tG6VuM:2mf 7V#|R@ Z_߯ˡyY>S\ki [d:\?i^*yP%H֖L\+5ݬZ Ko<{KŷdAKK`SG%T[;+&שaɆVtpO+vdp>)m">2nO [g{Bv$U[-XKl4"7JkS |~.Ƚy(&2DlƤP@✖b6iV#]E͘1152lc׷7ctvrQg!S`ħϺ_ =Bci9h;lDMhnsG&M;ZS{T0,1ED97{fUkWIIv.AIL geև++aG~]_囓r]N2xhwL K3-oH'h?S')qHUqs0!9iU\T0sbK Wb#@N pN3D6fC3p;-cʧg 먱-)Ʉ4cH/iHLP?Nyj= QX' "#3z0xĥZfc suČ:0L;p T$UMɷ&:ʩllmkUƆ8Ut&+7/4R2AV%T gԌQš\`)  VTjԑU̢x{d)(`~\P*Z!uϮLD+vg`lB1*eC>7X$(c;bo9c?i_ѭ݌bD^|cVGsj5PŲԠ]&x?0ȝ &^̺^TP 3jYSϻP uئ:߶Ov^x1(u@KuY=P]Wҥ`ruAX L-ɠ<Π; _j`A#zPr Ҩh1de+f胕E R`5=ThEYR:[&@܊`mmK"Yԏ/X?OmڼӬ(aK@6SRe%)$ Cddpǧv?nθ7d*,hb⮆&Xb =Nd@6bTn~.p /룫*Fe R{ X|.#m60Zݶ{ganj@pAs^πB.ds V.^u@иbhdEҶ: Nr:PVDM R{n#: g$?Aj؁0"ep% ~ X0FaE^Y` GE*C DYl9i?Pˁ`yC:b8K4YF%Gi3Ko^5 JYUۣ ^`V`-B@̤E@ |^3M:[P`HukmZҘ?=(Ws΋v) Z@KqGQqHk PX?˙@Q̠:= s!??(hr|d$r19mFِC&V.u \v)&tf"H"j å K0p9.Zc.QDީEE8Ø<0T" ZB?Z8.vdm TJ=l;k*=ya)S2>E;#A)rYi|M=@Gs~ak^Piݗ#A½U u$:w/}Dn(s{]]*LJgπJNKq2 ؗ65ȮYw*'ٹXWM5T$KDZy)ZpG2R&`34t[Ml7)ȶ~?0jw6MIkUx!KA뷗j˗ޮomD_|^\eR5AK|&\mIQ6LpBK&|5 JW5%wN1,#N9>7k`z 2;|pt(jr$yzl =S6YAÚۜ9e~5< < < < < < < < < < < < < < < < < < < < < < < aX< x@pz)iѢ< Z $! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCB<[&/I# \a^P*:R[|Bŝ$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCBP:$! uHCB<[./IucQ3P/p|B[ntygu˲eMjt>7UϷ}:O%*7e.+ `Ey+9Q&-jg-L<@Wgq*%>Р4RO\ E?-ܟ>~.w뜝Y~3+D {tTndg؜˭kp8-)uYiիQ[Bm5U, zi%h{;l?>v' h]cw]{ᇰ؜nl\#?GL ?뷻|9ۙrm[S: kFU%} ̴!.ظM L 8"d[JG c<qΧN wD9M], )"S=_WMd$3n:SY1O1& |>eITf .?l^|Íx7Y/|f$-A8yQ!'kMIe< sR&cXhKҎ >E ޵ @9Af]NK[ҥqpE/N4M]_/y}jj<aC?k4A:iPO{5r5fGϏ?m79 n n n n n n n n n n n n n n n n n n n n n n nͽ+߭ϓt6bgj7o<_.G'6`zt5:Lk颤J-AuUg cWE,wj%fo{. 
8Ő 'z(hj!ˑO~*ޜ2$_O/'/ê=a?K/^!F^1,pHwK8O:{5(gۅW_YApQ&g e,^UOn>{W'7td@bc&NDnEtEd^,OV+߄|^|gK2 epnL)Et7?w^QZGϷT[&vč 5 FF{EI{()uws!u5#7pNv*Ay'|3y|s7O^.㗼3*/1b&ӡp4XIe< O$hv n4ger 3+^/ 48nG^;l,67ߘ#/ƁU 55t!~@n?3Lkh֘}8+U+ 9f\J]"t3au&tbz"{g.墻1~K@wc }['ا:բ:|~87634iiz3AЦ-2X5Ր?K Y5+"~79t[#^uO=^Ĩ 꽓<|ޖ徻29Yߺ7 чȌtsa}ufBn;'_-=;G·tz|woͧ<i2lmO=rʻ.!{{w~"6COݓb.Rwdyacnlw`AbI27{ dtܣn-mu] ]q!FV52v ]gSH"8͡xv<#m5fQ$LD/YzdW$-{Qc0>,ʭҦ\,k3WtGsm0*zgD:8Z8'>RŐo7^ϏxxR)>0E~{y.|nٗ}/ELWpqܥYҡ*RLѥ2d aDF%C0/I6)n"kۻ,3hmS)BҮFq2;=k;1G+>s>]қ9|8a vgk8h8bO#x`L՘h $yP$ RK_O52_IicMbj%.'@'r]9ut%WדKU ɍ=/RjcaʻbbL-`oL\]ٲS nБ2k͂>'(O U(]0HZ%dA1%MGb l]S}{뚕{(y$l2ȸϤbQ9s 'h3Җ JG!m}Q:l 0^+y跷_D6dJh>( ⪯eUme^8xzå' 40gܝnr\qV7.dz$@&LAV:[mH\,MdfLzzYR戞m[H7kj)'+t+7,o;~ͥ pǟjqr6!'3=2.Z\E +LXIs:M5x7qq䐢$vma&naIJ0q4ET;)[?S+mZ-b"4ao_fӁ;8r|Kp5`o(4IzUN `hWȅ[\J:k%Rc& D\ȂIrTt꿵6]lo7eL`uy?_M$IYO'.ټ)桬?9>.򖋿1u&a;zwےX+~IE<=ս{}F]^ӯGagH"oM$p9y99^%qBi")%c\-W7%huOg}vXF8rŃn{s~Pc:\`T[׆ mݞ C=>pn"d_r+sŦjzs>V)sʵ>@TFiAVFڠk3W~Xw7ĺ `'.ĭ[/z+U(ZcNƘD_RR3 1 "Cp %,%g2w;h,\W16c'!V6j/r ,>Q|d9!}L)+2sd@1JDII @\UQQ2xIM]Dz!" %J_ՆuLK }Kv(ҿuk]uznl~}q@X|!AdOv-ԗ%Vo<]UT>٣(,xR.p1$W9fR`NR;`1"2XUddRXqHkki z0H6QiJds`ҠPgXm8O#c=R IƱeСrtۓKo 3S;~="vq/htzi:8bvY:D?8X%bIP9N+);/mMXNu +P=8!p l AdK$229bfJǼ iabI[o^1w`62A # ,)ept!Fp ,(LjW`H$+: 45 Y) dT&ɨF}e<67W 0DZ(*#qD$^ Us']9@1sP^\A8l ID*": tC#6F% d$gB JԨ1%ͥހ%2"Vӈ=ҧݪ)|դX\TqQ8-QG`Y$Rr1F2t)t\D@.~ʌy3Yv_%⡮8<|{[?GT|oFbRi֝ԗJ5J_UB.aXk-zN2g"+HWDÎDdËDDDYs%&O%me✋, Qc`">&a(T y. 
)\i0XV*" .KAf-Eଵ1 ύ65WԖC <|Au)tux4q{P&$֋cd; D}HBJ|@nҺRO.z +bgb0GhCTt6Ҟ|D7OLB 9 f/tBJՍDg&1R@PKM(68熣t&(F1JW Ht!ծSIƽTm ӿvO{bˤNѴHh.h]v12/PgX&3q&Le)Vv=7f%)MBedl&JeLyd 8WU=,{a(vʒhϒ;nDŽge%.zտk˦.?_qR8\O.]ϭ|1]v [jѷVآRYq]1,;Ͳ_;h>Juɠ!gmLdr)'D`Ȋ,LkKtQ+QU'.yi)lٻ6#;RW[@% XZ"JZ9~ÇH/Cǀ-ǯaKg i%GoMLGf=e㩵XZ.$]ڒTdP_E|9vh&d?Wiɽ^uYDU9!qCW&򜣎^3sne@`a&tx[tv9 l#=CV쉩rdm?Z[~ވ>&y[`qW+IyM1ZnWt͆J$ i}\, OJu01Lb!3/';)1CNd%XSrnO%7, @E6iӁ EK IzR KJ8äEnO[ُgrq܎myS.(~iCz[^e>[ءP&(s<'@AL'7ITϣatC6Q4hP@.aK9{BBJ(**!c,Lw8v' 4 ;JRQ$X ING/"ie&)ڋ1H;IN~s0KHO.kUYm{)ݚb;LY{%JYQђ$8Qq5di¥/E"n,tLI,A*r>C*"ZRiͼ>]ߪgץ } K2.Zǎ>li{%uo WmƂw̶Ń*h Y :搁'HNe$ `a`V mlS+٤^ .$AjRt {W֣{ e}~"w۫ ? G-ڿ?mY %ZҭBta>e힀<=Zn@Q ZLlP`CTix6# Z!ExRhVwϥQy38D!\A ˜ExZE Jͭ:0 cPJIjԩAZen b DHч@JN69A>w*\֖s`T6S382|8{j:eÌoO (n\?`E-U؞ Ve{ӛ9 t=SϴSƓfέ{gϡܹԕk竱]fd Z@'04ti;Icݹ̈́4ՌZv~1)A&3/MWweM%˭379ifB\*AF< ㏣Ѹ>zZ&ÿ?/:?ÇegcAft/dxSA2ϭee 3A F18c`)|l2Ĵg\&R$rLA0Jl1|Ŗ:8ʣ͖cP8K۝dXy+Y,g7\A>=/;^bVkp BiZp!H3aΚlR*aN"*eeWUU=EUF)Ⱥs { 4ju>nZz%Zo Mtx|&!r>KM5*L(X$.КkK(C/&JНD8D(<#N0t #*f4QJa:RYdA( 62b:ʮvs+!NlJjBunleg7xo_Yq.(!b:hI@D IItdDaYCboCGKJ1<&ug$Ϸ&yTΌ0|Yc%&8Z$[2'Mb9J lDSC'yjWA0(tމ/Kw[ΐZؠ5amN!bt*y29`lʄ1,K<@"!iϰ ,L0*³6CC"PԶe8;|-6xa٠/ oqdLuh >`:AB*ƣW %h ɪhhxv~'A!#N9 NSu?U(lMgPf ĵ5k7O:']ajQl4 4GGo٠HoXLFcPzJ& /v)huX|5|e(320GPI)ƒaJ*pRz-A" lpb+8>sMkk;Qknk8b[:Ioʉ!bU a jO\;ƔjQؤQ4=Evtjݻ˲>DD;?m+Ŝ_>[(W.z~r嵬v'd!E^(y:9KR&S)^+IbUץךڞwBȍFn9JcɌ& l]IG'F@-T6pY,5ˊ 'foͬ? 4I+nqIIKzdzzKxKJ6ۑ4jz-tP0#!}L)&UL6Kt,U)',$;Fj_ޙ2 S2xlR)ޠ)` sJ 9~+u`D06sYqp87xڡ|.n..W WuzX{OFefco3K?5iz2}=@H ]a}'ZYrr;zSZQ|:E,}jӞͣ”$WyHPcG.53s}FMq1cԭ ػwb\7z,-0.mOeP@GmyVq JK)..rr]I4[xz*+! t|ӎ &?Sby@`ɦ,=Ɓ16DC Ⅿk zB]$rcȜ[ܼ\#X,7=_fKFA x?|M 5 ی} a2_\]dzӅ5Aÿ.? 
[8X?`ܡUht ?)^؀%.c~L|Qe8l}r}UHF6'՗B/Kl7~l;M^4;, ֚Ĥl"\*OwK1ˬD\ADwٗw$q^jJ!,Ӂ7mU=ѥq1`^DA6[9.(%ѿƏ~VQg_ΎMoG{gIJ%XbK xP kJ_\_P)b$J+筭d h}T*4x]r}TLz{cf9{731)H[편> R%Y2$lV8!׫]0CTDN}pso=0k0i {!G{S&;h';XO_<1z6Xy"$,FS@3eœ-݆$?j,Oa:GBb{-zkL6F.p$Wmk%H>66pOnm6#$ 09' ԚB-sBN.X| ;l8Fakt.g&l'e_qDV|Ŵ&A攧唔4\(2 'UZ*⚢WU((Xt#Ϯ~(>)R9 ;j:LNYzeA ٷmFȞ\]w ͶUJ>^;li[y}){ xTjh|Id60\#gv`j6 s/^|%,Tr6}ǻG5GJ,ʉz[Yjl8HqS#52ÖqVf IcK'ovdk#=uwۀ'ۭE(ݓaⓓ[lQ1+UQT8oU"ezOQCCLT[j #6id5E&s=eO5TCO[415KI(V6-v>]ks.V{tFEG*.fR6jCr7|!/LƂ*E|RHYtA jkX|/ԫ"J*)J%0Æ=ܒ6^0"ZDlqEO=Zq:kbJ%xgL1J$LJ[s)>0rcq5mK$!RI[ qpD:]\_6(^0)y]t"-vqh"i*֚TlJLuq 5cqSv2.v)l;GIclMj(~w׮GTR|௸浦4-я/葞J]m;EuR7yHUej$`'.*n u|P8PG3PGQ#"ԑkT3kfF /7JtSgAҢjsH@<+Ȑbٜ5Y-tB)' txOds#+ |ŨSzw#X AOK^AP> ZNusy.;&c%ZrHbXy6 &WnYjjn)B ;A@&7!cD 7SuQ%e9 V((a9 ^m ֯^,xO5UN5IݢjH0R%J1f6 M)s0h0ͽ4xѳw͉0٦"3D.h %:JrJw )Vkibo= *z~e0M6Jɠ/6ͥɴҬ\-גr`m$)lت%k9fN^ zeHQ녢Sbc#fUVh:l8k8 #tW^~- %)vy,_YZ.g[Szsm4[x)o {Emo+KSkH lMX(QHySHK3JزmjlՀyd{[*$ܩ4&kϰ[jMb/Z(Up%۔lr%&WR5=@Y#c]jfpvyNI || [1 .Oi(vr}UY:{n{N|`,aM=,x*Rv8^=nUӓ +m jL Q-~cH8Q-8b|sU1dȦٸ,${5ȖCTVhzGM0F86hB Dg,·yIYr~x=H7릳3i-ec84{Lw%*=  !r+Ʀ J%%S/Rf*TkRh(h*R&M::Сm\Fo6 u|Yx-v\Ôx_矛7=@TW_"|i`Uצ~zl)ae/])3nW|}A0T֊m6DF")x T#6's;\#K*T{ XKvA,j%ZP\+uQa_kME :q9B#njWƩYL]pWtN)s2o1xT_S ^q޺;֖EihFUuE rPR`BdXTSAj09a4@CSQT\ 5+Ȱ]ٮ*IYrh!&g!JNzsR}-U[Zҙ4Ӣ,O,ʈ;cvJ3d*ĉPA(Vl|9Hqш x3)Ua2?gnݳ=l̽ƜW`o].J=+~w.{8]x9k#Qis ئK58(9O͊z7)GSJ$K4k#*ڟI_ݲ?~=_o>}Ym΅8~;YUv\~k|E}v3-/C_łQj@1kL&VJ%W%TQ] e5 A! 
}pf].jj]͔rR=Q/:: ⹚*[4O$o~߷|^,wKrHGr3Z~?坏*5 wmW0JBQw]x*g5'KXaRl,lqp־9~熀k/76k!o\8you1e=Ifoy_&=LKoАw A&:_N?S̛#jL  D`idg\,j( ɳCgi]5]aVIF}6Ҫ#747wFs$R3Z8DVAwmc , }1A[HZ_b_dI3}[>l69U#򬁋"A$\Som8{/c9_}Ylȍj6qq꼙<0 {/qjyD`gÆivG׫Yxon}nkQ}uhVpqa m!aȟ mC֓^k=-[o^|Gmvn}$-U6y™QJoo}u٘?yFߑZuUg8z1\Ǎ6p<=m>ι{+/â#6yRլ1rPBnxx9^UL*?]Zaa~ ͬ7]?ng.`"g4]e7FpsXߺz*p%w`Wz5f=Wo94(-7XνhK[-FvѢL oZ{G 2rՕ\Xzbz4V2\"?p_~?Ƙ/XŅ}~O(Ht"Oam P"ŊA׾_|Uv4!LfYΠx/y/8~]{KvJ^x^[\1ب'̊dK&e4i!:B(>SG!:S`LC0ue惙epк3K}KK26r֥$,&Dy:ir;0(bT¨Q;zJY#RںtUĦT,J db"RRٚ RTgIS.xk..VGR "c(' D6'݆"zNSmҩ2\|F|TSϨ(t|ڤ]%1L-˒-2 7*g tecJ&Rn{T$RDB1\`GԀ'i@%<\ ^',֞C)P+p:#ZI:r d r@A|? aqaS a!L50? VSXiXʐv _-=:M^]5t^Ck`-V] 6N#m/>QE9*VX[T$*8 )JF)|7h*:' -XP2¡NuZuimEehbn9PT W)Œ(w2ɲ| +k]Ny8f7CjB ~sD~D YQMrTPCjB Y! 5d,ԐPCjB YqsOTwn^.$~ynr{Ӂ{%D&4&0EA3|4ɇ#t Mҵ9;Έ blR-$_)5!P"!Fgrȸrۣ6GCEtթh)PɱN )hK**XjqIz&髸|N:Oo<+vZW[M }03~vlOrg1x4xBWo~~>v]vw3ngNxsۤD|_a’a YRR Gt@6 .KN/ X-[[@&\Xo#:![}* ]~Z藭UשGy6Y};olD޼^Jaf-@YI8}qhp+ ;sȦToŠGJa54oRV==[n=oig?6E )Sβ]O]#۟.|E~trZ@l O:ih,S0Ӻ|N\܁c!ӲφJթ wR72Bd=0Ԕ}L^g[N A`ױVm*^B 1CՕj,yp=ͅTT{ gOMIqѽ[\ıh'RNyTLP8LZ{"̦t&c*Y`!,HOyGy xb;Z @Q2*0W_v]*7XΊ\hȚ CP.}قV`:Z]g(zp-G^99bz}%c!פ,?jR:::B*;$.G!cyb5Cc`SKh\Aio÷n5wFM5ѷy?f7xHDH+(mѐVS5ZGRbg 6!rFKe.;.zyBaS&6֤&zWkz-$KfDD%' 1 *>ͬ-2@՜59xJ]*yC>}Yg6||;'M_J@>:#t;OT4?5dJ➅_y퇿l0r֎)8* [7)2BT;}9C8A.:w[5Fb1 R2`Z]a6@l*\Ye:GnuyD1%jwC͈n+sb+f^kMI#㽩|2IgX%Ȑ`p}du>FaȂ 5 P=jw*;SMN( &!0y9vx i-0KDfFDaDY d18a4P`TYH6gg|NB[S+G"hb5[I%U=Ԑ0iט[[oq^oU^u8_+ƝZg7+94.qG\qq늖elRk9 UrT\ 4 g,0eqql;{Yǡu*!u}J^r[~$݆-vHݬt6HNxk*nx% ٹPvI(HX9|0Rr<}oegDy, 0?惩U-lLB_Tͤ]4,jG="O/W1YgVF*p|>? !9'ÙL֚I% ,q PNp֧b?zv @QofwvP섗zF_w!MP;x3U)'Чb0+:VT29ZjaK^Z{g{W< Idj_-[2u-Lpw˃1v!&ljD[F0#{f}>ms;]+W6*X5bC_Yy~UIF%KVXNd pͪUzEha0%@ӞO39Ǫ3$L \&u5^e^xX i+g=|BU~6PfBp{eP(eA.黪TjŹ/2 ƶm^t|(kG`-]yv=qx6מ q#V,s966wy Rǧ|\OIiNB 4S^ZŚ-)K >ތ$OÀYԪ]%l¼ v{e?;"MA6ibjaJ}0SJ+p }t:f5xji{o]ڢDE`r[xY BF]EaS "(i|:E}Rʠ-(Mp\9ܲ,+3tSL2J sBTo3H77v|ܩA]Uai+/:}Cϭֶmt'wUْ(3f R[A:$~{Z(un4G4TbJ%TVЩ\r'Kgd[ Ҙ=#`%aᗅgd;zv ={A1;yt0k}Z9g|ti1~Lq g;8(azןq\(\<| Rg`l}[o,mf,dS?bp1l:/3ZR3:ɫØ݉} 6ÊwH=K}򷥘? 
6fzl4qĎVkuI2u;\._nD׍ߪ׍N A0 ذV+9' q/Mɾ=!Rȋ[qq툇@xF=78 ӵooFMUzH>\Z[i'~^Д^-V&KS!W~bstĔY6kY.>|xQThY$EOhn`ùCy=pyX=`~~oI.zz w2'Cl;җ;eG;M=\ܯ`~YUjXb>4xU֐/꤫b*T5[\/ŪQ&w,OY )YȜ}͉4I^ MStix6XʼVg'D䘐"H)t%+pI?ѿƏ~RxtѸߝ886k'-XK=4MlΡo* yDalkGI'L)s2NYR.$r5Ҝ#ԳßvώrPPʑB@} ]U * fE^X5qD(T۶J# I튙"1"ZM}2%GDq5s[O$V;=>/doc߼wZHl= }|U̷Q1Ac%#(v:5@1N*o28gwzZ;!ӝ*F'p!*R\1GUIRJ&z Tj1r0c^! AcJFU\U ĚbP1DHXlFΞ^!P'auz@)SYԊ2M!@%.H7KPCFS2CF\2{Ԓ)ɹgƾZP[PRҟp<}6k#ݨtnKhѨ@֣sUHsV0vFݠO`q8B&﷗G]͠; ]+O_ o+xxN/ӑIx ^i\{v6w̹:xO:gۻOY+xxdq[^Ө#ps1) ӌϲ#Xn4W8n8ռV\RGY=f(a/Ǫ3$L \&uwR-{V/Wp~ k3,o& 7Z6iTr]:BHU枺献dt-?It@t-St͢4X Ei^u9gC| VUKޯx -ta> r=cqw o|jk%(.AZJ 2I+_l4޵=P6eq""H)Ҏ^VɇS' ڂJHZ(+5#gOy>Z_իu))A4)Ӯw˻ mŪ)Z- ym'[ee@bʽI\V $sNJmti3@,c*(.McvWUXڊKNtysmݢ`Ud,J猪Y;OftvyQW8y;G2,i$${oMjk+L!-Diql&_DO AL6 \Ud&"ςV_,J]i+6 =|R +Ut$ɒn_+ ,< 4.$;|z8J?07 _ ӳg8Ӆ? zծ^ۄY_io'<p5ӦKӏL=d˴^x?9-@iӻ42DvW,l`rF8g\dk@h}ch4c_o'S?$p5paYђڸ1= Ge֑N^ Nd5&_VCB_꓿-11_?Ћ|,gI#v4~ZKrr$-nVntUvR[J}iƆ5:Mo\ }Ή =i}lJ ĔB^܊kG<*_ľǛ4zi/Q螮{+0jCb߈ Q:ԂH5̿>wmmHy1, 0m`v ^c#d;A-],)Ȕ-; s#}d]~ P'yG>ѠNꐟT=Ztnײ^nUP;i_~5xt%/7' ]؁%-L?Kҋ1 U㚰 ;xHl/,zVsOEv''4U^dϬ1btc`4VLv߄K\}4,J\%kQ !!AQ@hK \{\oX=Ƒ&SfQjM "7*+~2Jl]LQ%?f|ӛ^q;ocvӏۛ! k/]&aEW q8W 5]%27'.2>LJo2ba[b![ςFVe#Axv\OTc/7]$o9d,\tj-ȝ6% ZbLVV [p}7b2ElPo18jLc:9a) Nj֝}; 13yK W>vOj2O}n?fNWbC8y (ҹ0!&_-}NN4*%d\LJz #j/) >,2jwӊ ->OS#:JG^/!`[Wv`@`>Y,F~H np0vbޒEWt`tϮol687Ծ'/v $_,PqBZ2+kc#gBZ+CZEL/͊3b> P*2`aH+B2"l{PׂWWz XgbI+pc,K5m)dqJ:g,Zm `M6 =!}ocֆnlٰ-C,| 9=KI7!+Ylrh#;!p`4Uyx\%TTc:_-HQ{(|?rŢ.ȁK瓥e']O\Huv"X VV'սS4w9WFPs<$_nGH8CYRη1DmAk+aمZcs' zm$[ҴÄ &y)RH#)_LЙ e#Q$Y$p.h]q1r/.'$ē0 qNef֓6 줨=B"1$|AY$(%C ]2 3Ay'U$_M MdEA8.u(K˃:dihgͺ_ȴ~\A$ɀ%A' xd͚< \ z jEϓzhiC(jP3ϓɊ+f$tF{P9^gY G㸪hWd3XFi[oḰ6+z/j٩w%dFaK`ǞDޚ6|/zz|5M&BbV^~N-g,tOLl 0Ǭ %]V̱gG G Gy:UcT""x*q dyOH k?u\DAnpIꊫ#gV<3kKT4'F^S[=LmSAl.xy*rf>ON ] ?̠KFkS -NҊ_6]_Ru8H݂&/ܞ`B]?  V?Mܭsuispt;/j˭=@}HbeF7-J w,mh;%P6 3:Ŕy 8Z=Ta3O:P5P@Fg`qeRNn#H#XT-2^ g0[v*IIn/zO2Pb2Oȅb4B2T):^y( ݝēt3b<]wf_Zm謩E1sdfpr19ú7mR53ػPlt"o4ɣ41Ey1+gGSi-l*8BZ h\{fM.&^KlA,&gHb&JFEZE%y{yfQS^GF Z_LX :,%|Bbӕ/יb%~>͟9Ra6e9jt_! 
,3x|>qn~2K_[+UW3̙d‡}r}8m6?VGBFLw2wh?]6}͟yÕoX«bNj4Tn?.#8i]ODN>Uӓh9[{[zʜ"¬ R/qTaU Cs"C8N|&[K/yűOoG9g8EV&r_^[8-g- yܠ~xxy8S۷6qΘ =֋Ӎ"O)g-h%5},v7X"HKVz6+3~HK&.Eq 4ŭzem;kw0呚<]kFcE>n{ +7rWsRoe~缔+U &1jlY8kBQY"Y ZxWIWEly[a=KlZ+$1q3֎h(Q>|G?ѧ|{w}(QġF!AxWoQ/"^ƀQe+@?Kx4C9QqE00QB~ RfH,3>ƌ0:vŃjhȢ4 !KXKuMdM!}3>V南EdzTqcך닖9~E}W7ǑmbG?vA{tg|;RU*ȅ֭;+ԡZ&r#3>jѽ(HWߪ+Xnj b7[Q{َL던cZՄ,kf?'wg痃7f2a8+Ye*7u?$P)NȺ+\=׊,V 3\qzeWXLBiIkC ;eZ}shz=tn>K[`֚eÓB1bQ=bS>bS8BCuڣ\Ɨ8  i9>rO=Dt'}:C \BZ\1G1pd2N v;8g֥-c-j|K"TrL)*фXS Iݻ _l齬M/}iء@l!=ٶTbV},Rb)TZE yQٶ5d6գ@W3d`!r zO`)9sp^y]Qf\h) I,YC 5&';S=CMۡ,#dM.@\\dD:I!^Wӥh){)_O2qQTޔ\CP*)S%6UWdUQTx(z8,6Q!Z):.V`(Vm9#c? IƎXS,|7baw,|XQnw)=;?8sl7E(7~''J[ ^*B[7cUbH.0[j =VЌX”4(ZJӡ|S5sUϊ|5bw[눝.#PPtں3jGIu+A7Nm $UTsC*ddJ1Q!oCW<ƺ7@b! 3+:#;*[gۊ*&1*bT3v[x830 "v];#"8"&$mR!Pȭ;EQ*@h"J bLHlc6uő gRI[}bIkrPnnr(Q'PCuv]qtEqqō-fPMRB5,m%SMȬf5Uhb0aqqx0^ұ+xhw=@XI⸼<ּlܪ?:+TuVa -7~t\Dm}uG-AӠG}hA~EflMzS建<|` 8raw%bg5N) :oJ5F֐zѤk }&TSXڠVG=2lV>ƪBgr I17W6C +ԏyu*x}Y+j_k0̜Ch]\M*9_p ΂-nH~4tivP=jϔ)XġEew^_,_M'=B ӫowY)+֡K>\m[ %6Jh0bb=:'XiF+HTIj[ᆮ'ןNMEO/;pEI~`kh-<}e2>_/NGE,=YL _\}usUs-@/͡pv%cݸVոf͸p{!ʰ~[b—LŚ=(l1&Vʐ8!^^\φ)]Jr7xރǪwOb(ic{ Del%ŕH]V 6 P!h?T,k%؎xOď홷Q&ܶ*d@w_ޮRvO[·׈j ᓯ8n4Ǿr\O FS8TK :iA<bbr0f[Rїr2sq>p ڿ#f E6фy#&3j”K_%jdïպ.fq P0ȣCIh3-{:x@M >RªIUz*-ۭR,.k&8`Xjf1M̲.\u!TYdD&k!ASrW;kz;-u&=bQ%@OJ*XwrA)**\rz爀sXψR P*h4SpPhPVȽ;u[]C{.y\-<^3}eG6qwy$_3SOdf-rw_/-p?PiϾd-d"Ydu(!/ޕ$Bi"3#/>l60h 3Fe IHY⡣IRRY.4lż"EU-j5Zqq6'mw0 Jr< ٬AkBm%⢐h%!&ǰ`ZG@Z~|v5rVa`nB)G:;*Q No9 ֋ %(]BEvyq =5"C[C Px5Z=-Cʢ%NTBI+8; TNyt`cz95ʷܤlr >LstI$!at6y%gtB!?=Sã/h18s}穙u ĜsHZ-dG_|3Q\(y FKJĠ^S^de/0X9*y=X _,fv&C\|ZՋ<$[{Wj!,J&m(/X Kq3gF+j sEI'qS|SrntCĈ40d!tEf Y%Ҙn!P{J˽|v˜1ɖ=fZ^c B,b!"`H΃ )E%TK12H^Q=Spvw%ݦ7fӻOy)ty=Iu\}ɂUx=x76Orw]K\LOxQ-X>{Xas e:k\@e6([.EJWu |j>{rmchFgSbDsC !T#>vAR| ((0twL;+ 3g}%oT\2:^hwƟ_ڄtk X@ )Ǎ|(ְ-|)O11DxW-"F{av=9aRe߂oG~Z\UVI/ɜ)~۬sit䗼d&ivcv>i`^ ?Ƃq~?O(d3?,|/zσWXy]zOHfknzhC"vۼ _~c&홼Cs diќp>rk >,dnQ$Gt(UvΒ;&{~&.԰I#f@$ 4F[d d Df|f"yD22&fKXq9s_H$xD! 
}.{SDJsBBBBgr24&ͺs?I_C",T|~7s؍j\bwK+W#LmSSoM3W«ٗ qZh;>YccchI8> 1|U:LQu.t2 #}& dXdUKWUZcuRPK%o">l!6X4mj۪Zr~zW.Y;oQV[J b1u>U G4M[i"&1K"S7_$`.5BuBn92`}ST,˔񖑻w:q]f+a=_]%` ɂ7{УJ*?}.4?޵ ~]o~B )+3E9N@)Y][;' -fYa4u6;o~Uy CuGGgzoJȤ8@;Z5A$ɧ 9t):K ?&>zq'LNFyk +ѤCh5#&b|.D2:!1Ktv li+;#]!3] !Q"bB¢eVħo~`Y(rLL Je KieF/h H*'٬;.&EXPka@uUb*yJ=ߏQӎ s u5ŬΎ DT)[",| jPFoGikg]^:>Ml Eq1d@1Su,ʓ{[1GR>}(,Pvʦ"rNZdN<6`lblM.{F\ O #!`d`q?5ղbԊe/hL5E%NgS3sOgK8JmAɻh`bYh>+nʄyV,bWXc̢gI/dIz4i&8̄滏"}+ct0@3񨴡7!>n|v '2fLp6vףREuB͉ß9Vc;i,''"[kF=˵6#Vuc5 l1 K?ש_HJIrȻlbQ+60EO BG2}ZW-TBx_^v~+ED p^\S&2Oz3 p΁L,fLFˢw{9ܡ7G!5 )&N3T҂a22Wҫ gw>s M5Y)(tx;YM%VX&>OJ{MncOS+T.{Lbzz"*,?h)\uR4w0*1JAn!x!~D_J 5d1U쑈?s"z*W-%q5:IV$o VtjlJyeL*uuzoKM+V>TH=Z2Y[R !usfeG)* @LlDPEBɠ*) 5|M)3V-/eus.gS"fᙺ!SY+ncŮ;#NXbV?'W!bh<ש1i&c/+Ź^|='&`Ή!b,#hCEQLxt)pҕ:V.}˟/д=qy tJ$ړYb9>H3.t@E KBBs4ɗl>X;+|KAikPX93(-P*ܒG` G@#uxY kS[k݌Gg$}pIN|V*-\>5Q|#=JbkPnSr`]o'1P&-M!pm0ˑGA&l v.RG:D^9zfALS=^ڲd.wILX)3=A+SZ Oﵲzyq|.V_j/8}+NnSYN ~n,܌&e Lp,<-mjl7٤Kݏqӿ++"ݩ[X5YQ|4S躃4nYw\@|αҜťcgߍur\wv}\bYW{4D=ƇauYV_q:?e{xރ̳CiThrnt[gxVTԆe(%/ɧ.?ŷ&ׇ&k'8Ū[WhWR^6)l34ʋuw\2n˅L!wS=&8%wڒ q a28|1\z{C̎waPP<901 &j :\?\\ 7}uFtZ8$uV]h磶Ek,?[BO8f^4m˗&.(wd|AxTE.В%/e eqo%'x灎#hw].7O4,ўc>Io!_u qr+'ݩ2c;祘U2afr.Y$>(1"!ʒ19okZ4μ MeNIc2'Т!D,=H#B0!x|u NΥtT58QP8t5(Rt%}bMTԧJ&UB+ѭ<v=4Z_OkԤ"ШTn|pQ& M4J+J.^:69n0ʌ]v* Wz])[:htR<=Zgrg5%@^9]}yk,&Wo{vt"vDp, wH\UJ)G8RMbrѷQs!-vRտg_jgfח\QpxJ{.O=:rJCZm!YVt5wÇQtT*ҾNu֥3*o(o"ѢY7ź47O\[Z)J!H )hkcx>M̵297Zz̩Zu֗ Ь͋^W(k#:קߋaxлm[v{=; .UjBER LtF4F8oztI`bO53ŵg!'fi^vF0I8 =4H)"j76xM5 >U!L?ݾퟗz.%yfIMO)-cJYSI&LI+y֨l?< {@5/?% 8 RVH-SDAtZx}߾%'Z!մj\&婦/3^@^/M&A r 2#OYzɅtlYXDJKV,FVVK[\ y}"L.g,&Í k2Tmd6XTjq .:)&bᳺӇŒ7fv,6&2=2dTI9SW?kcD!WED8#P80E<")L (FBe<6xs炈PDQ#67K: -W>f B^ьyA\d%$ QtB(6F$X.U")Shp+P$-r%vZٍu$\kxZg5+9ue\T=.uE,KdJ>c(T2E2U{@\4Ό2EgY:CSz+=Plɹs^g0]gKh HO( Sr0vBx0 ǟ¨d_grt1ZLr<—n0b?F{9OZYJYk; A.Dm~L>oL揩M 54o6-vBt|5;lAh9`/?T1:Nh. bdwݏ?f/ N_q=/0SeӯU~,.4&KѨȡ NAV:+4ss\qn̼`J)'M݃ȨGBJ -fZ#xB arڙ9,ƳtN<&n/:!{m2uMIKZ1* L s!+C jL) !QxiB~! 
XPpD(<3z`80x=w٠lDTA$4Zl]{b @PU->);rGdy'~( vP omf'xk_xxFRQtRde.,)[SDBSx#Ϸ<:gFpleR3Hؒ(xe(Ԇ&Jw y OWtz8.OoCpenD1=p9{ZdsdѺc86>ЩgX%՝ub?jBv HQ& 5,+Y/F_vA>[_f)㭊J͒wmX~d'e2Iv0 3$ ൭-9`J$Lrw%!QX9" Z Pe#[ў@A0/O_4"+J$Lp4g7"ϑ`YYRA =(َW/h/4ɼll]k O/vˏdLB] [uѣUvKfm  SeGR Ri~mB7m$|:W==ʹxЂ&7EL 31$SI8`'\xJ<80Zks P%5ug?`Z[?L?2e4rė&*QX1!@L)ekwV6fQq:YFqm"=K͋.p{ji}+/KC~w3ojmiMr.op6XΞd}!h\9t?o0XN,ݿٷ훳a%~cHg俟3~L>Pqp^o(sžZsX=^t;?)]'o 8_A|7l)}qqJ _ g˂fP_uyv].2_w2ɍIUO`?l?l?ll1)Pm77QnA#W4xr^[ L=5ar61gl\`tA_tqU6hKg|_ Br>?lpt6i~\ݯ?f4:ѡyoc^l/ΚFSTJ FQp;OT2GkGR<.?n?s00JkB]m3IEOwwA%5*4uqM _~Z bnֶall1sCdg(%tsc@|i/`7ѫ\Ǔϛwf$%wt$+_%d;0G8C/y{^&שc JwJ4*!WF ^y)C3 ls əGv<ĭ{}ZxK=i wAb" 8ſygLȅuX'-rV tA+3 ,6g˸zAwqw23lwXC`8:ۦ~>7F㏣=-63oOix]-jv'&mFRdJ{H: r]0v4H βM+{' Ȋj*nu" '@PHzvAR7y (hًz %m*Al0\q} l0 ģnpqty6Z}=3GZm6w 㧨х*d  #"Ç4h+8h\VwO1&} ]UQ[I Tߢ܂Baa<3TLv'yq1ID'ALJ!QB!e&NƧQ§DZ*e\j;/G/?n|@fJ>K~!KoK>qEfd\R7ل eRD :&]oEAoE6~Cj=>bRf Ůo=y%~%~*fw F8J{+'$ OP-8tkXb/sĽLPG%BP@ZA0i#1Phc"QM`b s#5yH ala'stba|X;!6O fms4.vrt}]t1k-xn%Pcz?ś*wvwj-3cf#مjk'wvm:Wv˭nmBEgWzz?>e~+l]n][9l?ohe-F[5$?4/)AJ3[|᝻<2nKj$`|ǹxK\iW'n%~`FvwՐd@)%aG~?0L W«J ^>AzKxu{mGR%%7ޕm+~_gUR%D]dd͉Y:[N˲C` "0Ss#eZ UBKiR)/jWúj-VϟjoJlD~F_Υ'iSĔHބFz PP"P$HM4D%P{".Ku!hT9TZ_\RቚMQnYjڍ1Jg|ڐEJi4UXq0JI RH1)%gqN-$ ^836DeFHZ4 n!EblDpZ7Z9'^:R!aZ4!R:>K^x\ҥ=~uRp5XNXDT9* Fs\Tt&eBO-ຩe)$ĚXa,Ȉ3c_0~yA9$ȈzGCALJjZ. 
1$r&9[Soꯘ Z|3Gt;g9T0a6q:8quˀ'k]m"Zu=dWpusKAײ$at |BIDD#8x()Q R1ZA Ս5ٳ|&=C6svq߫ϳߧazvTԻv-N^{!UUq$PvTVX`V?f]t~ -8SkjuM`7APIFulu;Ee/iimL eL !j\\&t7OxF~);CP2seQ')EBh9cB#@$ECNq:#lx] mo¼ Sv܄Y}4-pq]0^Xf8/PHN]s<?vڽ>ja<`[g2pH%)Rʢ)-Dd 4 LBpiT{AZU/t>O'6F >8NϝV;eL; VOS0$t]崴hz2 AlX;+A4 g ՚X7EKKǼm A7 ۲Fᡳ$٪ʐ(˜c,0m4* g-XG5ZhQ:~P,8q'e2xddJ H+fWIi-OJ ZQP"EAD`i[$$< {yE짘MRљa H\Q R0mU.ey8n%W~֯ggbGsM99K(FeIg Q bDCֱH nq, zTħAX wPAp)Dq{e]`|gs/3SM("z7u8,{ⷃYb6g'gl#&+No2wև.OP ?qJjS85-gv`a!×ga@ ]~^9|vkZR&bU&ш2/.brXI%Kn8 uzbwNޗ8L>Ә4"zIY!/2Xtz>x[,ghGz?`7$ZL^ֽnx N%%-7m{@MNU j)o0LI-9RCݟxޜǑ)k$rK-Z~m$F:_}Wr=п}#|@%Qn?MljK!ˏ"ͭ-ϣrD,oNQbT-E㿬;G +p<Ŀͯٗ劔UECP)sRd۲Em@Ahԓ|U-q1Y_ K k#1хK_-tuK/G&+KcI@x;Ye,":ʼ;Լ"^Bo+J]'μ42AJpfQENJxb-x@-_appEOTlzgTEB% D,C%j1~LnE#*#؍~1gKcV>l>볊̛;˳Ȃi :V;۳FzBe+۹+< 'ҠR H87R3gpKh!Q #pDi.<s$޷ix=6FYyN_`q͕=~!]:oΞZ=jU`r%̫q:+Vl;c> +{ޞGxYή<;K]Q$B̠>KOX { o-o _47mOo]5op)zv]q:6y)rÂss)q 8E.AJׁk*⸬24hOFMN?{WF bnk"6 ϋb>'udI(h~A 6Ѥ:f(QBVuw_fee~O!;!%d%P"_4DmKM(%P20 xm$Yk9OE)zR'I2;[aD L^ehMeQbE\,%.F% CbIȽ3q )HBVu&]dAdgluY2b `8'U_U̓o/%##` K-2',a+rlG (irVm9=lx eZ~ dcK3ҖH^sM0ZŽKѢ5wκUEܽ_)z GDVj(B$ɒiL L GU j'~>`+Х5;М3ϒAŔPx] VrdiqMK[43d$$Ny+ɒАNĤD* *Xˣ3[`QIR9NZƴ6s%)%KK*Օr+ ca Br(vC@sؐKg ϴw7N1$bv$nS yc6n[h]FC8'`d"$nՒ{IOvNʨ5Ǩ,CFWEC.xS8l'~3dh/Iˉ ό3渳V̭_J/bٲiʩR"4Us(Ch@рdqFA-9A Ē=Ⱦ*>Iُ7~u58+viϝ|uk5׋C;汜FrAj({I B9dܓKЀbp.iF;EY$7a|D=#ϥg7aݷߟ%v{ZpR aD8,h\K#Pa2yRww{׫_LL}_˧;V> f|kuzQeE6 ʖ/D1|>tTgTҺe@z@i6J-پRUc~}҉avuuNF9?w? 
@f.^Mb3 c, 9f&aJ &#)UUa*»BoAvbu7mHI!wVO{5P |;7go3˿w5 , 27lO>V1;J0>Ja|Kf]2%, =OK*zO 0'A)u##je6XS.C:\R#̉ީK rz+/u󐪅c[ߑ <,}<|}pv:ց]AQ|] ڔ>',pef I_,Z R3JE\ڼhk_a҅;@ˠ ނ nᅩn9?At3鎍fKF64Yqkw?Mh+\50?zWx>pbÚ_{7^t[oܽ4_{[蛟7\;^&OAx-ƫDrB{AѯG`nTWǔTś`8rNip!p9jm_8M%_$Cqw%f5v̖~W]sh9T9$6ML/:*ϟGuH{}z<1&L@2~ۙzc7yчtǒ7}k!%Zi"i3.B6VRέܹ,wJ ] ICLE=&ٕbsI_]=JeW5K趗o>HiY*DN=c皐v k˯iCݱ#ԝܻz.0]$<"t/…euvu+N p3[!/iur} :vIqJ͗l:ZW[ϣk杺mYD`?6*: f!!0ADPS.paC&1;**Jp`̟9.noҸoո7.<>2.z.vwg*~!@]\^ПdMz2ؼ觥X&YH&qZ&'S<#D|X77C2mV2NwՆ!'z%6ܺȱZ5k,5hGk'K-zt}&Z_ܴ;th>2Yz3_ L2s~aw߮ogj$#J5KU?~E eH)Cb}NQTLSO]xԗ *Ti&qd4k׀qN&$d`"3cTiT-GO3i^ѳK7Y?uo=Y]+T*l+4 ؠx:h/HLD XE<8W𾀚YJ}2}V8Yt忞tˏy ̞-U㰷7w}RGŢm>B1!7 /u` @_g~>~f~4ЈF `|X=_ktkVcuka.\q>[@Sl-tDͰq5br9I**#;2}ή4\mf}/eo >~'MV?Vuhu+\5bc7{!H[DیQIP?@_;KOTgt}X(< `tAխ|_9mC?o2߫1KVo}:ϭ[3Wzԟ ˿-[׋v PТF?~knteǗ/*]s}Q؃kZ2^\֋?2'8Ul{.j瑊2O+.VכSLlXc!Ө܉I7p)h@_BEaEXFw̘5 qdȫ:8v𖸭C/Q;K%Ԛ%<ae6 oT!W1"l\L5?$>f2qT PK(~$>X$K<4W4JME-ETKY.O? bǻ$۫LT:툄B Y _9l%mQif+h9qOܗQymIt.:P]-(6%xA>&gbLVVZeGkM/,\`Ȇ$sV8bɩՇʻ68t\!ܻeWU^{ΤEӠoUiĺ =ҭ㈎ƹC ~P18"FȨmO1eOPS}rߤFR fS^+UyzY6 ӳ;>"+>p~v/> l{wtKvėjL&8V|mZ[h7vDZoʞ iή'yz/dQ7 D%rYlTd.ػ +RI툺~*\=jsKk^r6GT1yeB3" cp!-L9kkZ%'Yۃ^פ|syM2z8>xxl~zPH0w]|T_Y~4#;)VO5igb?YBd=F(wD͎:VgLihr NQsim+ `g\B$[.{K*"R#fkL1&&g_k .0Cp@A3/;Nԍ7M$֝&{HĻ?~ s5# d9%E)ٰ>T9vp )(Y()mMH'khրbV&3"#Q:ybdzMZMg澕[w˨>ֶ$| (;r.D%: A;}A?/IŠ ")&C4>P1h X>+k:6NSBW|J['tð,()o,+ /d9)* Bt{D#k*T;Bhl(TVVRTѤPl뚱f^; SYT6n 4ȆRa\QB{vKI2ORFe9[VaZЙ!`,QF[=)9ĨVi!GeEiM_%z':oL@ 'P3xܮ{5$ހ{)IQfҢeKFA +2ˮn-u(e'He'ɏ? 
E~#^: NmLwƵe)!A*Dto7wUfg ])ph2U(Oӝ;"tىE 7wMv7j=(hA0@HO:ĥi+H~T!lScYVh@)dxdƻK"Mf?«7j"8B7>d ɠKx?Z)]ж (}AH?eƷozF4@3ZA.\`ExaF Rgr*iDz0TPǔ" tY+&xf5~6zX !<˫>薌,v:8YT?;굁L~^< 텤X3D( FPEۍPfèMw~ݽs@&'d4l؍ voȿhO w_-<<0.(ccm3*iP4 Jq &83#BJQZEVYD.)Ǡ!B R={]~̊ 잡Cc7zΩͮgqw7CQ?9OR1->gwwϏs:3 ~rv?~@>3L>/Z.0ڙzxˣ ?BC=8|u=}]洞XZs79޸{8i8ܬF|syW|d0'4 .N!Hݑ1h3J)"[0qtOI98']I׃pұEr2$ Jqck{I|X(򩲟.^66@@EtjA)DT QXyzNIҚt; <$Hܻ%[Mwnx;G@2)5a=]STa&y$F[^unWN׼ծ\i d&]b ,["aG)bg]E,F {" dЀQOD@%h6Xj"cTB"tS'eIȑC:eǑ$&Ph#ICP.6:$@J' 6 "\c l:6?hn0^*̟'Ջy/);z)h6,EE8ZvO3;'Q-sq~ŧ|ftvq~zκkdqcX}tj5zIn)\)>;?(ldPҞ d@Ā5 za/b)*2&2*u4::>"=,rzt-iTUŔ*XUÒ\r'+3C?.Q&V֏ji6 L Mg8g糖s?Lf<%yMpa Xeثϗmdߗ Ο&/ ŚTTW Α$a',3|)9?]@XDg3pb>9jނ|]Ixy1{/ &+懷GVhlZiyKl)_O.>5+|łޓe-S;2f[דngM1uRo-z~jfc~HnHhuȺw?&A+-̋~슣wؾLNKLr{J'ԉ 84|2 Oð|o# @γR"`qY9ϏOQrK{\}w{z\8w>As8_G̍叫۷[]EtsZzR3?jDn ?UBx=]G~߲yN|1ȇx)~2{8lgVռsOGYwƽ= "w7?lRES+X[9Zɛy)2q^ʕU\few5B ΚP\օHp&UsUQ)tyVm{ϯǘ8nH:`e)֊gfEgrL{Čqk(Q^_5p%9% $Y* S(mZP)( ;eZ}Ch5HJṜOZaֆi{ [^n~z/D{CweA-t.0P>:~3F@PKρ޴ټF_wD(Ĝ:C \BZ\1G1pd2N36#{ T5lcz.Hl|ˇף5e3[Ih:UKFDh V*k1%C Lݶ!gQg,B4&Ss)RZt dR2__jrhk V\JL+*?$9GQ)zSr +Cr*>hjN&m9Ihd)nUlA pnԐt\| #Xz#cu(zdUJI LNE؊/k`|bWu y2pNKj(R fuޔj W!{.{c*3+Ycr&xy:n^u/ﯺM#Rp'N:XB-5V_$46\QF%1[JtL@]&#"-/x^Q.KD>Gla%bW!0 >hr&+0v;SFx3_g:p66#촽aF#y]ytM`e慑{}m~L>Y+j_k0̜Ch]\M*9_p ΂nH~ tyvP޿jϝ>೻ġ&,R;;KOOPlėzxsA7EDlpȍsƒt@h9frAχ0O5w.MCKY :N`&y.Q w*#b[򰉞ץ\񇋏q֒u5+M?¡' (g/[;]vQF]Fj|tZwOr47#>b8nӐRo2؝nrmž &W}{ƅ'-Lαwl:h5닫~ƶVyMݱ%Em FpKzoNerʞ W/BpQl<Ӽ u5+T] *_eYFMzKӧ#'( /lua-\м2[eFPtkr1so2˫K4x<^x)-v(l@ˣ6'D&ǔݜw9b-/z?o9*_p>ijmoNfn,jg}4>>{2'檑RtQ%(MX 0SP)&h/-1g"}vdCجão1٨lB5S.*ᔀ} |@Y+eH\mu~.,0a7Dz mvo^ q &e>TսYm(?\5\~ ݣ+ ݐ!퇊Ec;.h])m?,}wF{wfJM^r./M`g\6bK׽ovi}w=pdm-t&y1"5q1ٖr?$ܜ܂!6B;0/+=)"N~14ZxlC9DTF1l9ē?n'u|}-ZQjU!/0TچJMSY»]uхMpF. 
Gc@#e2\T5&B2jL*CUv NwӄMz~4h6TY{u<{E0_q\z5bŔ`;g/ @,MJ CꄚCBS-.h' Kţi'\6gSB`t2l3J%$|FŢJ!PT$зqre#Fc<<#rH<@L%CuBY!nm:;4N =OkN/g7r6ѳ<Ț,w>U 'Y5)/fvoڢx;N)Ȳ(MBهrVVL!UW*IsyѩRvTIWOIC`O ȢUdr!)Z쫫EU#:b \ҲRJ:h6g]'x&F%(ϺMgG?կf{`YU:o1&R |i'5"<8H%Zr Yl?;ڈ [BDG_8(g_ 2W2,6:eP>;%e5NFՅj'qҭFR9%mGoc EUeL$t g!FkԬ2E#pt&:V{oKf,ӾE⤜6@ѰUeC$!ԄB:7z[抇RnXPZ*W+m#ٲgˍ`fǘfj IUB%g>'ɶRkږZmUenD{"r!:CG5TlV"bv"V]iDh;  %lLstddא&? GdkLGƳ^v:^52b8xy4B]<}|ҌUOun1IQ6^AbW;ʸs9>Nz(x+qU3x&}y}o>=\Zʍ$bT&C$1KUE˪lъPsT^9HӒO1 &hdch|V(S[T-lW⬿@Vgm{R딱WsF^+Y,-{T?{,:tj˒V+r*~<_Wu3 ,bY\^[SMODӓ75CeヲU&ʡJjedR2*&WVY ë<*=832Uj"!8CC^ESh^d{~z/PφO%|`4}=ɒ M Z4biCߙ774MC_Ya̓A|.;NO}6.'S9 ȹ"[V?nZ}c}bikleuUt<qsvbǔ?b/I>o O??`yW^և k]\>J3zvM?VZژc9;>fs1/ojooo1l/aML~Z^vl{|>9иx/Bc+W~Ͷσ>bdtdE{2t9L=KnUG?x{r~HKz!.6#'1x4VQ;dǛj j -Ǔ>O|7{~kboꋧk3t^#}[k{" [: ㊪z'RQA\ \Xy|Y1-q8 ~^ۦ'N\~5+ 9h#{?7P"|Z+[,*IY-z;PW[Ӭf5z֊y{,-b5[`괖71oϾb8Xig.EZY -onr"hzזMFFZ#MP}?2r}I.I_CrS(cֺo|PFS**zkd-|~je*GEy]i-/dk*/|2ZJ:JaԭR :l|cZL!h#q?/fL.IiR GE5Sa4:I(*)V!ZIRoK7PA0cH7c6m"K#I>bgk0pf >뛮Z&,C됔20vaӮx7**RR!Dm| zrB#5ZErc& , \B$bnNLa=;9~9dᝏYl4Fڲ:$xyB )SaUW'erZn7'M=tFVcmyNTejF$dJy 7:b(p.߫ F`4VؔPRDVhr  IkSmo,l* QhXēRZY{E/3搔hVbP[bӮ,"7F\r^$l, @L3+l %DJPJEu)I6 SI*e»@4.ZmXoBXr E7(PI9+Cۆuqx(fu$JL@yUc-}*eIcјenX"\Մ6)c#d5tą%DC5ehk]V0ڊ08S8DxXaҼ>kQ Ki)1TUdqzX EW F#آyX^$ݜ6l&7S੒b5eXF*xd68͠ Lfk=$4 { E'KU(N@ ( _XYUL~gb="z]C HMj3x9CH4L[A qȕ"fGiB*?%QvdrI1 y*ED1m@VʤC@8NVj T|Mt&#%JE5nXYgCRP6h+ ~J )mFveJsO.uKDJJc8PZif^i:np|_mb_"Bc(8i`#RTP!BN`"ue;3l?xaGz:Tb]Qw\p %#GޅLМ6d-D^#xHu@Ky@82(}r,d:Prt9|HhtVsc ((v5Mkj,(t_XK(|"2H&jZ!T^eNO6L%57=Gd%YZ{[ģf@܆`m]K VfESU_ ~3rADlLe5Dw <}Xo7[tut|wu2W)KbTB[0+Qw@vpC*r]H_.$߰#TK^HS@ʀv1JpQ uY.z | l+ hwX fVh" V N+]KdyWXN ГH)Y~]icQm\g$0S"IEQlV=;#"kTEkP~V}MDJY"JH'r(Zsl<A+YgY;5V,e36HT 5i"*^ `NIc-Jw+ a:H؀$`>oߦJPa?2K6ѦicA[{;{ FP?mv}Z}εDӅE#)nnLJd3 58)Qa)QdPf֘U}r%JV ]b{yif0Q̟M HrVZTHျ!(A/QǃCP!Ts \6Zu~n@5(ΨT@=@e!%,`3J$=|"" f=5y^'ægc B`+BJ1#xrUpN9B|{?M^2`Y0 :-?Dd$dj"0tsPuNc-0'`ҥCT.u z193Ukg'&AH?4XV=P(-g/KI׾; ha\5gm} ZZ-Q[ioo6) AP  Ƭ:8F O+(\5JVyae3^\U.$ 0!#I rV9]W곱~xԇn" 
sʨ}T5dWe\FCtN]Xz+9"Z`4rˠvẖ CFHz+&R*۶zFw˶M٥4 9)5PiTs?DK=ANkES|>` VrVeIv7u-RN6w>lbK2Ǯ:T@=v%PRY*$@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X Je))L?(:OF dG2 Jr@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X JV))|\t@0W=3`-Gֳ* V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+ȋ S:"W?%P+V9*B>@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X J V@b%+X @߶փZ?/~_V>걽^n6Ɲk-M"Ηg_Kv"`:󳃣nd˒VF:YuY.R/˫YrE틋ݹT9p~{-p/PφOIxgتZrgX^b}xprs#zն/*"yZ#i}9QY@1 1 +"[+tu0LGQ{ظGg:1w/ ID- ʡ𠁅Ka}!! k6XWFwmHo}ٻ,c!d{`[/ȴ4$bU)V?Q15&}+>Y-0FVd&xbj`$Pk4&Y8Ϙ3nuPXPr{&+5کq3˧{0`\Yx^It$2sykX}:iBϛ5'LeGPe֭2[uI^{2x3IL_3I1+C` y\-}e,ٟ穓_i+˾L׿ͨCbN_s: ;lF~nҟwlݸg_^ׯܼ_3t5TgXƝ Sowm b)WջXq$s5n{ĢN&r Ϛgqv.~0{OérQe[6% YhB'v'ӊbr*wWVOof3Z?WG%aΧ7W2vU9.wt%AY ^ԍK 1ko1IS| ,u(u3bOގ/DJ.˗]`O vOv9~jy2C9B̹kw( nLD3O<)q" `GՆYoECˤ^,X:AFk Ka<罟VQ}=7^~-}tf-OwT]Gm:ᢧe47g=$yP1$-& q!Ƨ 娹4w=4xԳs !EPv\\%-k9@CI|5ճ i pNԳNIyj CJ_CNQx!AA FbFc&(PYd!xhΧqT*pP`X= I;T5Z H2e$ +^&{:sȍyA-\MG4#~2{ӻR15 /<٬B]sn Gñ̫g.)ʋ,l۽L?`K2n\HXGl֞'Q؟WӼy8Ce'\S7 ͵ :P8J;H Z//QYRMne4ˡ(_3mwOqdbQ>Ԋ5ۚS>!,)[S6yI+pTvc _$\F'Ͳ.q2{#jº&ldv. ;D;$&ol&W/puxţ7\̲EsM1LzZse.Bvu ?X>ֹy&a"P/)T2+GVMao2 ao]8"^2H獜i徸^Ln7e48zAp\٬yJ<[R' zZxE }m!hP`M5 5pk8|*}nwטp3pF#|f#rٜY>9)O~n_\tqWŢOd+=xAZD4S-CWbtm#_Z z5qE6q1V~F5nAu>L퉈k~;8ӡok㺥Uo"$ XqT; *Z! cY"\ףjr5*ﳕw~]im&G1hDzΓ(vݏoG~FFjLu1@9@H`QŸHB(?2qgl UĀAճLIF z( J'N;y.)cZn[6Ή;ckSk;7@BF}-m4ǫymZ%l1ief'k 2bN8%|% b9gӲ9d_Y7 ?-2;^)juR'o0( ZM1A6F &1mLc녡ן']*m֓S'287!Edb1A+|.E 6.V\bDPdV@M"2wkz$6Ge \/@@2Js禨0E}Y:zoqWhO7 yZ\]htRPŒ[x"F H" 5l|/'CsgOqZ> "Bĝ8}sIt]UH$sD"K3$j[ Վ`!g!%O ,8Py\pQ)MʲO)M$'J~=Udh'Ǐ8C;soZ;b$>WłQYQ brE JI0z ;! 1u L.>8Ӈ*,B٧lz?}Πa@7pwgVԮwdS^rg跏_>wxQ'#9{݃o:Uk=OtaZIw aN@z[F9m y1H;ebr¶d0t L_6P7L:Y Q2^hG輏B W+*FYxB >$ D2A=nƢ*umk(TJT> }a 4Z" "cWZc*#ך sRQ3}Ζ$'8PaI݅.kW7AI>,Ӽ$mbPghH NgQGϊ2"hЏNXjH!]F1mq$YE eRυ{gUI.^MiHk+!PkE0uP\׶:aA=҉. 
ZiWp-K.[xPF´K\rڂ,gh^]~v3ZI)0 UHEyɅtbyܫXMUPm6RyxhBXK2T|)J`s2¨ PH͆s2*la38'9YTdSke}_._ n:yҠl3:R8bU8Bo ^5}.Zb+L8UjMJB\(fS} DJlCrH) GZq糴G1bIǹV[6r#=xhqE-'nf>m$D_hyQiE0:-2!CĢQek:Ffp$E_H;ma>q(|kUcF8ZCoP2(|,{Reќ{Ab=֘ )qA _*Yڙ":&aUvĤ%Xnxh5/3Mu$7$_g3)9.vF8Ń-٠ER%'U$1sM@B$C!vK;[Iǹ4g&b:cPFяZgtsw 3OYy)J2Yp:+A;@g< ?͋Mzy`wyfp>;JYLrlrrr+mH'{4{f ^cƼS6Y,$ERE(2ЖJQEdRi J}b:h_guC:cpv%wٯzQ}uѢ 81lB!`RH6DxNP9R2M/D8Dȴܐq/z oDYP.F"2D锢AYN<BўX=u^_jfC\/6K<|s *DiVnrw[}im>-έ #8OR: DNAsMg_7(_!h(p1jlAKicVrCўk bi5'`k4on).9WRe`B y[#Cz#΢{iV/e?lpeֽUՙ&z7@@8@imbڭQX띷ݐ{-z7|ooưm=n&G=n KmrudosGin[W|c@vLg]5~') h-c$맣ɧ<ǘovqkoh;4ezPk+]wwEs'bo; V3B(hF{7wAoO|nWWm4p F/ -OvCy?[%u\:Q=]tw̨QEZ?;6VOx+xrHvSj [^Pi~"$b 혷1{W32ѳìZUfi[3˒'хejiLw6Ϟ cb!1EѐQn"4Hd4* OjQ.s!:1!y utm@v6 ^kĔ%4 ":GvF~:R8U)^u"2tSVqz?>(hu O^VqCۓHWf$l0BjA4WFU4%Jc.J'Vz9{e?W ^ѡXX :/WMhэwEN`Q+].ovk낎qh#a dcSTFXp\ ou肎7K3'bZ(0*0O|h{uTAsfNGP9Xuxa(%hgKCE(%59}mW`JVKt }4L_B1*>Ո^2X'5D%#*ETeC= 8®`$VVnL5״vذ 9딘R!B,j)m0DI\(! >]VS1?≐/eQ~IAdp'_;pf/@`qqkA_Kپ=VHDZ1~z7 qTTv&{; !a1^BYh/6K[/g.G}[ϳSf>7EΠ fjmE$DέinяM-dn4ΏĢŸ~ߙOՌq/?~8 g:ٹhjuoObydR4ߟ|VPw}gyK2Lf/D Η]γZ8A&7{, JIO] _"hai)<1Δ E9+@#oL/ϡS!uOϡVxEFlֿx!it2琩###m۶ ^@١NBi!nE4 XdhI1+z1Jbd*jmD˳{-)#rZhFdzXt爺Sgr^dO'NMzzUE6`I8i*[Rl^ 7Z fm֟Virs1h;WkOr5l ^ !jI `)7]/B~O%v$WEJ9B~ݒxuF%Z%l0^ 3l'a`qRHD/L z+ic񽶓is(Py,a;RC\/x7h(p1ܐAKicVrCT2еY nւ\ah}/iN;C,hhE&g.P%Z[<8RdF/4AuZDsGKɸ^9]lgrjb$uDcyO# 0gm/xEC]V|u>;YdO׀s#X88td[RĒ2ߍت"εr`0(O%Ӛ`+u$ 'L"-xS#0!}V6j\ (&RK3[JJߢA}m׭ 8 b$U$U4x{=gR`r){B! 
8v!^^ Z~wYwkO5U+^ Ilc[fD{V!qvy9,-@{^Y!3U4S쫁gڥ$.Q(eN%ykDɵqWJٓaj Vs"&.39@&Bv76w[_}>p'qo~z;O <g'ʦS(JɖcJEkkD%+V%NJmk4f\› πcֲ~JT1aA 488Oaҿ_b'MN!^ OӼGD _mM|mjϡ#Tc/PL| Q ]b"!82g OD}Agc1bӓc kSA%ɕds2I ֖8[vX-lGz-lz6k{nn+2Qjn#i]^?]^|%MCIǬ  6+&RFf_FVlZB=JJ@ T55 &||Q3zSb."Z:N8[0y*VvkUc= ؽ45N+R IY7&K/E03m@m,DfȌ %2%6E4*+ GQZ¹lf]<g+8c`~` i#$zx,]< qr3h/z8l+k̒Nc&ӛduͯƠ+\֖B_/b8#]SdEquu(OhظPʵ]f@mD͵fenUq-p-9GT,8^r.1,;i*ڄխ畇xa>!usԪqUْ+.ҮdoHˤ4 `ddbaI"Ȼsw9\$X?\6]|9UPϲBw1J_u!1kmȲO(A _&YL Lb8HIٙ &%R2zHv9dv}Suo] --ENnj<0`#y{1D],h9JF`Jhb"e)_\Q6 ϟ`6 6Liד젴p )I:TU4RRZ4j9*2IbׅP&Zr.%B6ƌiBmƬ:1ZsrjURk!0c5'pFs(J<|7,WB1jnnCW+-ʤ$2 xS`y;Dƹ &Veȵ:[6(kF'̕Jz-# NTN{#EX+yH+3g]H\$-dY->m\'S$NX t>^NVe%VeD) XA(Ig6;V OOQk&uoZP xM5@qu~0BeBR qAaaOp9h\L2%680JLa6x!5TiOrCFkĒ+,t :*B#A%G^=aV+x!U6"0<9&f /4wT|T,i/)@z׈&ay=Պ!1*` NcJK œ84x؟`ZL|pjC6+.x 49aV˚:O@mmZ$L>bJ{a2ӪdpҫhR0 R&V0 ALj :B/Q%(  H&bZV"@T. U "VYѰ~{ ȋ ":t@2#&@܊`mmGDŽ"Yԏ/X?ON]iVP6S]%)( dhz6]o̟!N‚.&jHd`QDX-'G ]r<"P1> P>B(52 hP!QbޒUڜ׃bF4f,PZnk L 'B2985viEPALh4 <ij;#RF WGnP\pYT2N$Jhƚvn #,i`J fo^5 \YUi-=XWX@K039dE}W:V:)JP P Uu0_Xy{Znjj:.l6j:kk-Rƒk!nj!f=0P ->P;5,fo 5$A{y.j #d`09q=(_φp|Pk>sGk8n0JxKDM:39P.Z9$AQ.a2!R; #*|Dpz;`G`}F="ÚI^!|?YNPE1\ƒr\ \1DNFt-ϘV0 .TC!Ѣ!T(Jb&2⓪0tԣӷ]21b o|MImwm2~M)lL?tth_9ؿs1CZ9t̶sпsoWin@UJV *Z%Pj@UJV *Z%Pj@UJV *Z%Pj@UJV *Z%Pj@UJV *Z%Pj@UJV *J m5.t@YRV__ˬJ'+^#٭ctQְu4ƦJ(,H88`/X]Js@fzӰi|Kfb}XOF/wO8zzJns< <rńٿY^coC;<]CGja>]κ5LF?^nm{ '!67H>| e'(eh*K^b}2jUyW>hk!݀|-cHx˿՞o6[_4^>.WwŻ+ WiZ }^Ru.mEmK=~,\V>j)S5& ;I$I7Z| \rN!̗\Ois'Y,ƘqM6fxb!X͍?܍w;ey1+Se8:MWgܹ#;0 EX؝:utv»N?}G5?ѬL>)Bb>e9gt?JZwj?Z.l7:}k=Osz벞N |&>rW_ A9;ܞh'Mv?+\<"֬_/)a?ldLegeY&?~?FrG_}5* c4zLcc8#Ţ!|2%rϓ_P LϷw|E_M~fEܜH9]dX5 "%hp6QZw_[LFo:|pf,iT7M??3έNO˟_ud _ts!j{xhI'o{u雯?ݖsKsu8dt#Gq:Y帰H15Rx잋=q-zX1lmjlGa= 9鋣6TD ~##^?t |\ڷ{@ Www0fAnɥ^hػLu:)^QS (,0qŷ͇0] 7[ x^Hh$֫BwٕP[}R#7{jVg?Cj}F.`PlZo_4[D6Yya׌鼺Dd`VgO,–'hKrl7y7Rr~ߥ}>HH8tydWݭקq3\g;RlXmĭ[i?:E[y~r痋4ІI_RPl]O/Nq'\\Yő:~6=V#Y>T60^b3e HFJ<(xU֗O~ƍEtvEmU>],}KXƅW-jFn47ݎ~{U]V9zq18[Oz\6Eظg_m/-ӍQ[xӍ 1BJ[7/sGF\\jJ ph&8qK1^qYZMS]ۡx;5∀Щ=Ąæ4U"Ox%u )>aSo*핏o~ŕ>  
0gbn+5"*{aՋ(tٱ^hRW%ѧz|ؓpu~XoƟ Yڶ;n(o֬5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o [CP5o ܆p , :zs0ZВ0/0L0{nmql42H\U1@#?v77_&6~J\S$áx[=CR$EJCrdq;-q=UUuNYSja}d#j5,v7ѢR~8φ?/W襰e7DWs$ᙑGBi/9``)9)?ЧߔoXұ2v!)`gi'ÉP \ K V>v ,./fIuRӥ^P,:&[UBZdArDi%X8I3tc=TYjȂyF^;3I <@^QhS,^[Yۈ?QiSbTs'H텏<zy%7xWN,"էT*/Mgijb`@ZN|)JDW*Ɂd!p- Jϲ"uH/+sfǔ ~>CR,RY2I3nLJ$ϛvz.}w/3 M XmlbשLڢtV=ZVfen0Ͷr$uu]bbş UJ73ŸgZlD2LC}H%?巪C`-D!Azp~C%-s(&8ErS@RGSD0%" ]sjL]TUOjkm/ u9W  TĄ ۻ+O0[w5׫dՒrSUJxV/LLwn۪^O tRw͵hKٿdzww6WV ́`dCs7I^vW a [3%L/w˔7$Cҙ2i灊/ϫ.R(ˊ5ј Ŗ-|u/e/ZF// EN}HhCCxSjx[Sµa"asuf"$,q/o+k)nixq4B; 2eH#36$PG!fKWAL;(|}cONg2qPQ3ꠃ~ejЃDz6ucЉ>uKTIS_4UKi&Mͤ45fLIS3ij&Mͤ45fLIS3ij&Mͤ45fLIS3ij&Mͤ45fLIS3ij&Mͤ45fLIS3ij&Mͤ45fLŒ DgD:ِjz.[XYN45d/4U ibYl^l~Pre3 Jίg UVyKm*CA&MJӤ Bf;`6<-GdmIْ+s;rF_R!|<ʍAzE²}0OC:%xpUAE!2@x` G)b\ӝuyA:?vqb(lmߎ ^t5zyJ )<( jg{O*!&دa'qn1c J roc7/~$v qүS2N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:3N:㤿zXMO,[JoN`\'0mrcPڽ"=|t-Zr_=\,$O@2bWPh>FG"G>^GlN"k)zGyiKDփJfcy.o)<  RҝcX ^Fst^)es DK[$5}VRbP}ֳY,Jfh,`V??;o5jɖZ.Yg8Vg70--5V6Cŧ 3 ed,3W-W,]~Rb`EYb-Tb8F{K2'5uF1ѵòs^qLN;0o:sRY& [r709e|h02g㳲T7O@Oŷo3l۬G%x{>019?uI8,k&,rdϒ+ROEt磩)Igj-mX-,ԿFm~w?Bꍉ#,^bUǗ˲Jy, Q"ꗖ)ZB \ 1vlQZ;k)Z'诟E锪ê/c8 77T 5ӓ30 {Z}뭟`$$|s$ g-".+&2K ~ȁAD0GN^PcXd"!\aȆ CXqbSH$|VJ@1HN,rpXMڅanTpΈ֖SV0pD8e8 !T]t"4D.;+tcxٍϦgɖfyx|GPtDI4@Bˆ**Ζ˓uI7R zd8.^r A*G*55(* 2Zd| ..2O(sOLo_n&[3)| Ys_ܘt2]U0֫=ex>/Fap}c+X&hՀ\2%YϋW &=YEsW{}WI.Ɠٹ9a!1/6?"?b/ 9U2AIiO 7lkթ胒H,a~"Vn6s$t}af*~ny(#%\.:w|lMyR {vπ e`SOl``[JRŨ+ v+(6~6>aOt`ث=] YB>iN\}:yc\*:;,!a.ke-e@L@yM>jb w幏`+i[iTV5gR=p2e0l6Yp6 pn}]P*m FU o)z6yM $μYhvi2E"0zB; ~x_gl6"y5^56Z Pⶺ7[6.[wou$Ƒ#8.6-,F>~p @.ơHe{^uxVU7]G-hC\Swk ,2Ue<47xyF&|ۋF'xz10.T}eG;7;ճU^hYc}4Q\&v_Kͣfc R*XH%$e`]|'ƬQz@,ER %EF^E J{\>ƐsUrtα'/"OFF&T;g{RN7}!qQ7rӯ ><7]윛hȳIJ%YMפ ^>UYR%wS%/e9^yRoEi)=`1k%'or=KZ<Ẻ].},:O>騸hWB &+rb!蒋dfӺɈC|mHHEJşRȵtLdR .3JV'5s;iL}8Wy=u{&5OO=OJ$֭`حnБ<:o'#{p#FȨmOWi]ʞOiʹ }U]7`V9_h6ʓ> Nې7F>^qhszONciYdWSBB5c*JuIS>RJ"'Z\T! 
D9cN#HcH"r_$M%"hy,5^n6{&jǴg$AIt}0GԚ5fDIE>IB1bOX6bQ&H6VHP -Ag+LLĆ!6i5CR~'8X2uɎ,댏-X94CA͖Ryw{$ɋ! O8ÙR+6m0lN.$sѐbіT|F :VbHԢetVEx$bC)$jr(Š(5XH͆selUf+l̨/;ݭz]\Vd |Vqy'7nb 8C^ [7#ؑv*| }lԦ.Cu0+cTZUDgW}2@\IT [fùbdzӼbj7eo+wAhtf^J Qdd"7dIɋ"N:lX1`a4E'MmM&"KTB`d2N5ypSV X,b3"Q-"&NKFeJXpViڗmY'TSAaߘRgoTv&昂!GNzM!%4&5͆sEUGe1:Iɮv6fovֈOE۬sBYJ2ZH$,Y*岳EX;[c;[IǮ5n7{&l\s6a#_pSҵ>Hg?>OGjѾ_pF~!螁f%҉-BHtgjcFG+IPv,I;h,Q`D ""i18h!sf@8IxUt 0;=DBV&~Q"e%fi@ 3{Qz>X[e5@3H#2N¯7̺^#Sd;}Uxq%OT'Oݫja߻tŗO?f] <=;_n{ӱZ+n6!:d|J~Wqݫ#y{v>Qgym›v}?/w}sur'^pqg<6Lû^$l_>UVo]፺1.^8PZJ8|;wu}YgjY?U~aNugYzǃ,WA_h23avٔ!i4{,nosC]V'FSn`bqFM"XB9†Jٍ}ʂ`ױn)d0Ʀ%(u>ZkAWNx{^%bUoQ /vN{]ܠƟ_q)ҷ*@#>im17FxpVMsتֿ+~k{ >Ơrj _[9uSM[9> G,t+*X5Z"TsCEa (䌢 j)վcoR5E"`$AZYI{b2% H_3 o`6_o vBL2LBTBM"RA_xp(cBb+~!;// Fd}b5=>kh1WJ-(졺KT<***뮏^g[O0K6ZWiWW 1Mx!0Z7QQg(06A=] B&%QH͆sOxͦ~s`p30[q{8.t7~ѳXA͵g& 3H7dT>]3Xe1!@+`Uf2{ ~`k wm@{뙜. NW>/C&&-ubo`ףV잨*ø***(*MqvB[fȮ6[TJSx#`<䓢T\фŵ>TZ[L<ʢXk7(F 3VH2J!b_y֨HpFl+zLӪsKY_ݞm-_p5&CgN.sBLV\5>7 [,I =WDAc k?*g !B֡$b d@iB'% #)e5AAP+2 DkɐUlI͆s&dc &<ڊײ-1\;rU \&OD^R) 䊎N˘EV>I-ZTB5ߞX"4?V$ $md_"DiT'ȬfTP^i@Hd^yBTP\$- 4ֳfù z޽crvt c+6t fO;:F@1|+0,`K{ 9k E&cL,b @3r:$hprej5Zb~&?$#+Bu:tNs"{Z4+T謱% !:2Hno8Qed5w#LP^8:E$3 ED2J,DmF%{mk]N}EQ3-W9X2xC b&Ydْ)h4([BګG;XQ;l&!ji{ ^ĚpC%i7<wLH'r>F0swK\ؘaR +4h` x ?{e:?g njѯn8 ]9 Cx,漧N?}Oק%b$V'lr$%0f9)h}R$D'ж'uiK {=$ 0 H2T(?{Ƒe ᯛoUzf3`Ae2APϘDiI^g1}O5R"-,%!XRwy>ιV&qQDc9nqoWW:D~uުm3|&frjtfٻb۟jl ]q)G)h" Q )TJ]A=S'Bl ).k^vY"(QsUCeDȚ3#S/MwJ5p:vB5Hs wQeO[vT"́uj&JZRFձ %q/i-/7p=w,ݾ8Ņ6q0TS!/ c?%<`fvbxy|τ\}u{s("XÜ:Lq6WŪTQWMչQxOd <BUAVKZgEmG&4pխ elLPIkeSy.<2COf~!@zįg~-esJKCR5A+|ZN͆1[iEWC"eT$*s =(MʙmEjH/>PtBfqHc48Ǎx"^ZGL:0e9h{!>txb67EuSۜ3w}&|ij/ifHJ95iY;#mީY#g18>(ҫ@_N-yq1Nm>58"(Nl KJ j,1Z0B;rb`/^5*+_C8OX@tp28"UZQ DGD6W)RWr`+GdLtU S"%lՙ d4q>@U*_b7FPb։Vr6B/7ɩ,SIr.7.7/L8EaKɚdgT+ MPDIQ|a@P;Ygc{7 jl{~vq6/< g@rt$0-5Ԙ?5B`V@j@ԁ0Z{J(=CgTVOGLM]nFbߍ'o:8PbIٟHASUzY·|/?^'Y)0`%3(*썓vfG[1b粃^В5toû4ȾΩ{ݯn<\h(p^O}n ʛW'JխEq J*@}iD_ v0` @%;j 䪱#䒒ƦGImWy(W/Z ?zĥ5[[?2 gd2$g!]Hdd)TIEK)} V\nu[RMgx`fݚ~ 
NG>ߵ38jn[Io\\Տ1X(:v.XH6#5.k!9{>lElQo@Y'w)i[M4 6^ JQ1贫d*F*5qkRB)FWE'|)9P1'P/@jUE1x&Ή'OEmZ쨕DAL*3J%E5ëʼnnx/OBh}MtJ0Mz*zSzlY&@oʅo5A޴17^zss꽽ٿOvx[k鯽{_޼EY'ׇv6wk):|.[_}ȋׇ+Пտ9gڞ']mel㊒tt5mʇ&>WS`#_W).|v F;VQj_OVJO8)NEY.z )H־eKҡO.zJqj…~_._ ^,}wB׏} ?^U'oMnvz߰v|~d;]4m`3P6>.9o&+q3E+2wD qE7){?O?zc ]jfii{fjqs +q(C)}3fOEkeHK]"pD;4n.jFyx`rai⛒ _YCjn]NI[n-NL\Zk{T:ɪ5CY8UEmP-_PeN;v~@ T8iw$n0=k~`>bkF?|($RPJjW hA,0ѳQJ*#ո LA1n9חq]3>3o%~ݑvST5׻8o۵tnmjȋjj-EtehkXdہ{w9Z/c,\Q}](BRZU"' 1fWKل[1}E.pg?³Ҳwc3cL]?WͬTХZN9Jel$YβVHJ˾jBTMB=r.! c0V)6cVGߨjDǂ'*C c5#D8̞*S@EPh/n5,n6tʶ1HT&$$Y0?GDƹ VUȵ:[65hщ0\M=#7B8bzrړ:0_!M&g]HB&vZ%} ڸ'S~Z'OMlC1Ɵ7EͪTbe#sHTPT?GQhSUQ5Nz$ՏXލǨ`|HZjzLa8렰'a5.&rL'%U2lhuT*} AcF yPXtAڳD!ёPjsAtQTmSXH 2.MV^ 4c 6"iYED' R+#\[ld*.U`;0pZ'eGU$ʁc;= iɭ+eNmQD T;z!aa=d4ąeXCѬc-9&eU\`VA!8ɘ ̒JA4̹HNm8eVC@Plh C 2!a^8EVdu:VI E &NwlP`@J;g$ ?c:tPXS"zU|JXLՀ\I1#~h3.@Ơ) y0޶]CP) 6 ?bF50tl/p Zţ| p`RP 3͎,x7UD&0/4R2Ah*3j" J8 9Ӄ}aQ{a䡥l Nrr!R]Pl-AץVݣ.%eڱI5")X^,.zjjea@H/G* DPP(%6 *i,`qs޾] sP$\ XR0I푈w@†:ppC*.p /룫FUuDh e@ R%8@zT̨ƈZQJ+lypI hN%e#s4Pqj9veEPQL,(QKF|d)e8bzchJa<9E@ EJ! c=H>+Nhƚv 7]V =, `k3KoQ5 \)U6EaUٻ6n,WX|x'-~Q2g6UÎ.4-s- IMRm^n!d&1D|8߁:Mwᖂ `d}A|^6n8^B I!ѵҩ|j[@h?dsu<5K۴h6WI` PMq(X3 R`5F1k(ж#>fAk`XgqKY$玂* FpA0Fe\8fmH vf1[Pp wI <U%\xsȇ< 34.KH5tR.H܃CFB m^Zc`Oa"RAAVB׭׋ɰXy V _aDEڢ jc"'`~ .E^=!6qy1,E<*&#֋R+r l] xC3_У%r 9Al̵͉H 9V^()"yohM=AS'@89@{LO`y4sr&%˽ #ptAŻv 0pw0[@$`kt^;qwB|l( !(B B5۳h498B]LxgLLH70iLe┾yS}1qAW*e,+[zϊ;3P:&~DSgAPb!1`@]:`L#2 T @B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 l@`|( |@wprL(%zL fTB&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d=_&CbAEN(i{&8O"92$ @B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 |@A~HL <25 R*L|j@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 @G^05NCp@oB1z8]5$#bz:q sB(bf#9zX~ 9˞ çiv!U 2gC3hO1#jvHU9~4$jX<Ϊ q0B/k7)eVa:!q&Ɖl:([J"v൅yV"Z rV൵kv ^"G02(T Qzk:KDo}%,PQBHlѯ4X2 6Uvn{y}b^Bl8+\ WmSqv8m`@43Ö́, 2[ɸО! 
Eu*bST#6%f a*Aª=mz Ȇ$8!Ryqh+k$=56U`tzkmDM'R]n\W[EɈ)KM8m$EN#( V9BysZ( [aηEOgo<`w6v?FWueyO٨,{uƼfr=`[,-2 ȩϜ$,fx_zM^Dz*šVu"RM?tf4VlE2Kh~UO7:N j:&wC4zzY/jbJˆ)'mB%iqLDuob3uV{ۆ0Պ?UǴ&نSVfB9zq&bJɽY%ifu$zcֻ:j3az(1lxkiaޖ|~C9 RRzQw;-k~3mԩ9I{)H %sE4g@cꯗRWc铡~:v&M3$ޗsɅϞm2տ2WT 83`.5O?|KOO7җ ɇ͆VS.c_.O(9#%O񖯟6U,+|8]JKӕxkADuƓ LsgA$`N B1*y[ݎ62T-@D%Ul&/VhP͘6% U) o$0ʧ1bFb״ɉ4Jo(N)$hU*}١EXCw83OơARITFXE b$״ic%CSZsZ xSԍH$&^M1NoLI!5H% m8iP7[`Jǒh,'|9 4Ugұ(J U4+ma5mr![*g E-bZgX)gۡzpR; ՚"X͞nGrMd,]6zӛf;;'ceC߂>Wb}Eh1kdLS\ K̤[y*.m>ew䊥QrRg|Uø(܎j ??8H:pia SL\|x'%0KVV,:#eq bxf#X1eᾆ|xc#cF14bgFlѤt5TP`K4-Fr]f+5, i2*B~4.aNgAji'Lle+57;/c8l'|Ikn%n%YINg;݊tؘSJ_goc7 vl6+| /l$'UOՙv`/C7{߫vOj\5?7;^E҃ҫl,~}wl~IoCǡ^:̻ޙa^]0H]zt%f{?}}Ǽȏ[/jq[:x9eߨԕxB,lՑGC8MxQƿf(ȾzӛQVkr_.Fy3~~ܚxd]c`sIA4T,:y]E%/UzolQ\߿5yq|>3PZrL_x:F㴬|<B+a4jNX{f16Qm6'l#hi̎UjU&;YmsaX,m$X)a7>Dtd'. fj`FakQ*mךùթС0JEL+A:V;ѕІjN_(-,H^?/?ٙYG;_Oam֢~H'|M|@rR0KύVHYpjD`d&n mMNRiZ֓|GqhxCIwPcwѠTT J⥥JrSJ* s6Ck4r'IBm33O|Te\qrL>.6-ɳHi@EË%OE 9ym:&8 >ZaI7?W= dVZfKbVJ^.%IWOI0UFE:\gCrȾw u—ip@67p?qc8\gze2:QR/b8H!K($|,)*3Zަ_ZI_d~i~6>YտW,Vn'W9vBVK7z/u)DvVRCw/MIT ?(*2iJn$ed}L"+n5~呵4Zh4R /UmF|Q˙s jliUCDKJFv\)s Ō4*',ϥBnuc8[]2GRfU%MK^"_ٹM9Mek46IKixGUHF/oQ;w 1 a%qcs=h-©x?!.ĴR\Sma{XTڵ /;wu3%53Kfrg s&ޑ`Hdr3OGYReST.d2’R4^9\4{EouhwE،瑚1Mo>jZh"c/*WJku;V(9Um~௙j ?a]Z1clzgbyξU; ɬbFosǛ?seb.PDs yQa r- wTz,y=x}a|9iYy"u,[*ǹ:2B2 Dk_/=v&/zble6ckviv5]g&Шj8Eo+v.#DYhZfA ~ nriMw9Idop H}7.]FoqRD> d X%N钖bz2z0sz6RUmS0u%g̒竾{Kߖ*աo$ufFQF!J2#yw5e4w[~))᱊~l6aŧ4&Zنw,*wղ&3Kc13Ÿ6cgAj k*쮓B hZ%8ԻRɺU2*ٕ߭ B2;Pit}ǩ*/+7ㅛjg|ޭfmI]5n+P<&UNxVL*I΃&YC<ݙj!;ˊP&ȇ"5Y&CV(Ix$guiPݷ=}m@NIxz1I4$Q\Žgʚ_IiT`<Tr0rf3_cCfsn7|7.%ӸPWOR;#髗AAHq<DkHz%KƜזH!{yNU^\v+_E[ZK9YYd-9b 1dYQZI=gR|b2:T e}dM( e!5FQ>'cbb:Ĉ1%YOI= .aZsM 8NA3FtyLrDSecW.Oȇhڃ,PKMw&O-B>dg47\ ŮE|0 dŜ0CԽ@V'ՒQZE%SAۯ yὶYlh:5-MCYk%pܒs47j6kӫi:T1EiuB)6;y7Q6<91@& 41E=bV<@tt!Ƹ҂ HM8.wI xP"r0ҔAd@#`J؈!-?BxS7rੵ,`|z`y0]IK#sn3dԠkՔ0o[Yꖼ)><*W3lskcLc0gU[ⓚ1PrD<:= 32#ȽYe .w .,͉OywCYe'<)׽S(Jb֣Ǜ ^dW9U\ՁSa<9@*9Ce ).$YP<E"6f{)nͿ'_~薮os^P1[\^9bp5K K*ȈlhΦ8mnNT?ƬO01NrF| A,%\shMA#ʐYd}{q!H 
]{>֤Ȋ#)XlS|s}'$04{Q)ecʈﴎ8TCc9fg?DR04S77@egcő~s}€ҔzomhXѻ=}ԩNUӡ`s/#rAiP]"2RHi߲~ݼG`R~) 6wdAi״tws<ۺqd$7]^>o2kHp)B%S;Hw.K)|h΢6΅H a Sx_Ɓqowdž-ccN~i\b5!arzg]t]L0m8 Wp~ࡈc 2+Ё_~HfE;רhfh\U_rwsK=gF忒,m_c!?ݣovv7?促?/m$0ED) d\1dŜ%F_o#G͛W03 ;+>arlQVD| <k tHM#θ%)Er/bRXڸ0@9 G? ʃQDQ٠\3vP-q D EєEB{(dH)H-߂dx){L4d)) $ S&{[cK6R  'YoA/n9T%<H!/KL?a)Y.#[P ;=+qZE+%!و't> O apk0X'2M Y+{͒gnrq2^~:CFE|$5j:#=9qpp=NeМKUk^ Z'lK-faFLT Zum6ѷdwh1v,u$y8kiYu"3\%HEB&D@PǟdQfr:UΌUܙCJ" ^ʘ>J,~$RQ5Yshf$IxJ^>j}wkפ:x:Λ;6 ǛN_?d+,v&yell}!h^rR:5 SvX+4ode8Fc5dݥlb>JU1Ye&ɖb gXk+JcTw5^k\%Ewzh@rmІ~_hkX(@ *X^Ub%#j9E:E/kΈփHB.f`Qya#13^sNɚgXw}S:*ָN#1_# 8qcA=cZ!Ye`,977!Lӵm*|10۴>obqC7j*)MA}n1ovh 47ezzojKnrsΙ-qS}9QA+՜(R Qy7|h~^.- #]Y2O)T(NRQ9uc~݅Nul~x4l\7/UQ?L*[>PX˶p፩j%G]k=)3YcO#ZcgU>eJڰ74;(juzGS1~DW^Q4FkDPrm݁$ #W|$5K4eh Q@Ug?>%X֍3:V' w4"o [Q8U]OիH:km[cJ7UFsiDО` {2mNR )L\IO*!R4_ɧ''ͥ"]Nǜk6G)NURrrJp]8i=FSNRHs}*#)D,"at_'d%g\SLH' T$%\Cd͆n=ߠzVƋXZqLډ\in[-mLJ6 =@\)-cmEn Jb `R}ayHWHbH/|`.hCX6̞Mo]:$JP:Ѡj4&ږm`v%a$KiD~c߱Ǡ#q8ɛb=l>nB4;mK=F+%nTR;2ki-#a(M<\8 k4f~nUNO/^|aH-lk=wB(dE؜+xȟ<$pɼ_ ,~pKՆY7=nGG,zzc?\EՑNQ5Q}\m7h2 Řy6}Y`\CJkNR} '3aơzj' 5 -"'X ^-*sFD2Ia 9RL1J #|G霫82cWmhir{fE7lY-鋴Lcf#EObOttMj i 쥘4=O;H̗wm.5J6>ڛ$:P1D(5HQh$Z,&Hf> cJ%P&Z(/=)0q!2|3x_D^XBV:uӫa;5b"!q2摚_oUՓ^cPw_lD9q(CR_q2& 5FbRh1CL 9CӎK6xq*MqwբFY4Њ5Fw7YX2۶~¢$B;J9 cf-:f=!b"5,XAbbOP/vk 7X;WiUZ5ը2[ mdluK1 U)Þ+$5K8^ꓵFi0Iw4|O5Vk,a%B.k5+oVxz=R?F%^=XXch pZ=^gSRwEi%w:J*V1v I쾋W^%[qq>܌&reR#CԛU|b+ po|XŔCbsҦƇL~s,ইL9{BO^~M;m_9:_Vi Sj]t/!(B6 ToE- |Ԣ,HqԚ,D!m~ުOu,A8D됿.\FJ{92SߞJȩ{hhASinٮGC8ȏTz_93#TuW\j =4| ]1OFv(+`r֡K:^Fߒu!xyXaV/zRy\nm\ ܵRc `u4(:9(*$CԾX*=ώ+me[ @aJMJc=VSs@)L5tn\EJ/XSܬɮ4ps[pFRzZ5M4pNh(q( W'ϗw3' PE [gI~CJI̺G#\4P_eÄxޕ#b$V\s3<]J:Z,Kr=[-J"ْp,hV**•W43g~صu4V1CpՅ MG7?Xs3 >΍'A34qkG͡d)<^1 r2{0AA򮌁pN‚2 K흑;f`r&ÀJPB 64MA1!2R!~*̷L|ɝO(`>)=_OG",|zsB|~U["a)8u"\]mYXm/QHW6-Bj]J9`V@3/4~jS޼$8S\cvV/|0Hefa~8޷ fM S(E ^VK%(q frLuC(0:Џ<=*Y ۵od& f/ʜSYqklgR%h@:5ݢj0AGeS.7N5"0`VC"6d~ͲXYKeij.MKR1'YGǩypGmrk>NV۷POG i;D9en!a[R, 2j=}®I"y)wcڠy|4 N(=?C)0́YS]&)mйMqCTh՝!I [m'1%&93 %+5ϩ !]wF[!; k yXJ“ 
qU}Fe)xJҢ@;}GH%7[_MVCdJ-h]#MNpjt,7c}:*@: nC{`O>$}L4!\NgDD%$S2#9XꎩOA)-sBuJ.r-[ت^h 7QNh.eNEljht?5&ϐl釱NȮŽ՘q,qVޛȿ&KEui6qZ`ilrTY dIR&)rOOOBB"36a\\Ūa[WSsߴ,eA)NRC5>Bϳ}/K('Imp\:h4 Ē)M2d(f5E8Nyj8V 0`;VRceQ|&uaӎ+Y ;at$eVM=d2 J"'FQ(g++ThIǹ6~q4Sr/?GMD(?AS $0ΗN5L^աJ k|qvG:ToJI5qowbBd qRKRi!rMZv;v_I˷ם +XVc>ZdZY.oש@/Ĝϧ=5C, EwNMolZn/y.3~v~7UYѠnaWuhgq*S .dE Ҷ "ly @í\.Z;!t@uQx$f} ^.ơ>{=hH"EDs:A Q BrX\[^R*qve2NwcE0.xVmM¹.w5̔[nGDE09_>z27nEthXzpEl 2j}N>: +4OV6sS %Q"X=학ՙ/ }g{4Ifr6mшC5ր$ň#e@e\NYf^D&sΔ,T̛P[ VCww!Nl(⏔110Ѡ[E0X7he4L`yMG >n!,t\oH8& 8 3gaQC'Y\AZBx4Ja&L㶦1 xQYdfZxݖ52%R!UR/WDQ{j0iAqQ D9Af%Jr+b(І5t% %u)2K45`mN`mUpFpRyι9g F, sx kX, ٺM5[;LRRq:Οr/hXnmrlI`:. T dŗOO ^:+ h'{NRlrwӲ;XІwCFF9 JfK/X8b+C͇`[CVۋs ] ժ feBKD5Xc4C5 -`4+ɭ9"Q\g*4/|C-E/C=Q NgLa[]!(plwgL"Dlo[}J57f|L/QĞ1a~)E HJqW [qÌ {@x?ŻV)y3;Iq K>׸Ӟ$%\x0!dW+߿?^sЗHjyV*Dc/@C6f8<ZP6Cj ޢX%KBe,s4C"#V+jfD*JfճO2]c,cv>Sh8[7@KFSҞ3 "Q؋OY@7Z8#qyJh}x乐ס vrpip"?Nuo_ (\ A|ǔ2m_C‘cO*vۥ>=pO+BXH] w0 K^<"~ZKM4C<БsuO ZvmX z2/{o̓09 ;B"UR=U sb5$Yqkl^tU^ i+h!~V||,Vv03nEKDөbд}YSCg( 6~ 4e̸)ѪKXUr޿O_^=beLM TRz f GQ^ѻWJLENWSeO ׊WV!ѳU} xQ D8:o (Ȟі> .hng*h&2W4nΙx՚zϋgu:/qBtWYBz=hA=+wWdPr&;yu5Pn*,2pr¦n5BC hB׎׺Bxp5Y@xf@y.rt_ 7aw!a؅oNI& U%Y Wqt*-< ذ#iW4۰Ʒtz;pzwI.IkaR6kűw^)Sao,*Μڨ:@H5Qbԙ  : q}껤`R ]ڷN+l2EKa+NlK[NGsu.Rx89sU§:ൟg~N)Lc"v0D /bts֏uEI[߶_ c`(4w90> >%V5#N&n#)2'/H}% 0|zR"`L}~_px%1#xy}Wp +{(}y  ++&<jDQnƮk'[Xpea,CDOg:f!pQ>Ϋy`a,I(#4:D?\߯B n L̊x$=#l\,rh:TÊQuQ߿Y;]蛕Q&w&k] 9XܷOj#.;b"blzwHؘc`Ck8Mj_)kcMI?vGZK NwM* ޘnA2û$LHA9mrZ)p[dm)MS"9tX32њu_ɨylJ)0ܝԇѧѿ^ɰ-H)eAA>/(1~2|dl6Fo.8G|RzŌI:p);ʉ 9ЗHjyV@=T+XR7V2{etRx#aMϔ{gUA™OX|hy:݌$ @_NGpZhÛ伃b {xe~_-^?(n׸LF73(*t󜬲Md^?'YE*H~[NfhЅ/h 6']rvAw*+] 9(i1% S% Ye$,-VUe#T%I7z? 
Rfy7zIn_.oFZ7s/Vyz5cj١֢ϣf{/~~+YtUXՂ_D7ȉ9׻ԁt΀8eV!arC635'JzQY1k.PqerYLSEŸX'IVwǧ-Ńv0Ǧ< #F3ʚ> TUG<3BXfTСjaV* 4ojRC t$CW{FZ4pywD)qSRPμE#UHPT?(Yn/ʹqK1I*J҄ &IQ@ոˌ&E:; y9{c^N8x*)HyW@x;b4?|gw0Jf$U{ӊq(Y" Ƃ"HK8Ϡa`b9Ṑ_8s՞0RJ:P{y < q4yeX?+>[:x/I"#/,OzAJĺXGJ{Xx\A57KX JVZNOVcEqA_GaBY'gCjf?ύ=[jnfqje?/?~)$ M{F$S# X3;/3MQ>o$HRYEٲ-ȸ#2##<jXN)ɩ67G%O;߻wӒ%1UQp4:z}< ZM]Rj)D~SX*ct,`=Sԁrf*0er@eIl 2"5f+fY%k stw Y##L\T0EP YgkB0TK.dYT IVtn4gNrH_ƅy7_ǯgUny"1qP̳=څA)2vӓQ^S00>kg8Y)MPJώ9:{*3.V-K2JtD;gb!*W` %n෸H:ɨ,CLF^fp 0`Weoټ?jQ)2ąXPFQ$"f#p膍:%*i;m`Zy>cu֘;Q*L%Yڨug=Fا}ٶ0$)t2[+`e$9O I.7k& ?F}y5qo@1oc^"e8:3ZCk댫ԊaSIZ%KwGgz9|ЖsR X4%ϵQzLm@2 YB8jh1H2=lѼTJe);R@a*(X{Yd _ oR/6Y20Fg.i`ʨhҁ_a6Ut:9wO;5*}\;4+Gcō\#Yp{tY߃r2(7TN>by2Fz:a"`|=kx}7~q,]YB8?rLJLqy:Jo|b'iyt8qW7ԣ.j\hri?zY=5p9jkCҐPۊL dI ][zurт.7α< `oZB-=33̂Nˁ[2Ch2VeOxpo!EàŁo:oc⚗Zbǔb{[ZK> ycZy.HAD; A7N{Clp|}t TE;,qc穫"?Clm oTj9ren;pLe^m0a,ȝVd X1^隿7؏{%⦙1:'Btx]OpVjVDE3*еT.%"BbMfl˗fK8vToy٣|k0s-ȇ:IKwWi4!QAKD^ |7?߸cmQ9?Js u )x~t-ڰ@-,N u/ >m7v /Tъ5U Vd?53tcۨLkw23پLofa2$Za5TkwsPR.;1NO{;nn;M`hMWHcSyL6<C%ʔp>re=>Vz..?-wDڳ*Zrˠ{kZAlCc['b2m(ӧ R+@B͊UX?ҎRDt^8E &T4 tTg-syU4u'[~^ *(́HɻALl- 6\JKAP%PH^ fvq&fj;dUUzת{T(4Oc`Z h CWm TҬ< #iԻ!͂EP`hzq@b)o4hUu3HZ[]CRdʲQab"@*t_ _pkɭKQ T$95Sh($S@D海<-UyHN!zor.GX62$N`LC1};~:z eD:^FfF+ki [xNclkۅ6w0oW[ksC֞DRɪkӍc2y)htdS8)3[ᢈGEF+0jTU,2;a*bB)z[PeNb2^b)=C quj'Cc[݌L}U5>w`Kj|GcdV,CIp jrK~>S7% fFXlдeS1cR.yh*tbVGɸV9tmmQ9ͳ4E_?u{!eߏ>˿W除/hgu;]30=s#e qA%8PjD&xT8BМFcl11Vf>5iV%fFȾ3TNqK?k-9z+ʈROzCE#OK"x Srƫ(vfU[CjZEXaW6- xsQwU:I]1ю hy&X,6`l&7 *2F+J#Ӿ$PQBx.`@@ 0k3cy `Y&tf5Hld4OKmmU5ѺFg)sZ6*#ϞdE*d* = 67h kᘡ[o\v=K)~!ކC""3MIsX6L!_vx(0rej - WPrI:hK{q T:H?$gĊ֘Emv#Yx7Y#F FFO|7XʋV߈9[hcw3 rRθ>ФYn:v+[` cGyDqH5gS[ Ԅ ǂѥd3-0ֽ˓1h5D=\A3Pm%?Ө~ j CV٥x#NY7N-0D/ǕJkB B..&a`P83m8OͺRBg_ժFŨ"K=0![텰U;-0~-vaV1W} ^GP'8f6TPtף6( +솽dހbUJ;iH*" Vx! 
1mЉ;H wV[Fbb/UAL5 qН`TS.ijD cR@$6TjPJ+)ЏǛ {5 Wuwl"!AU 0N)X!Lb&#=XprWo"CTGUNྞp(oHneqAVH0j^LsqMbd!HfK (%H𳋨pȺr}z)y=(kF&Ą'Ȅº$AaKDӔ4 w< 2~bOjqγ򧧨WeCшoO\(7X)JA0HQ`DVU~\F݊bkaNx XBNhg6XATe1%)UD3|-!,/4;UP;}[UvH61 #PS&,Ɉϐ-Nu"+#ья Oa)\U#!:6^0SFX|BEd5aw GdfQ"S \^(dFa8Nח<͋ZyE2Qi4ҲV5 ֌wRk4yrg;l|dQi}CGcD.NBB1̤Ti0KdwR|bb:V+OQ.ݟ}$OCɓ,D`, \P CҰ3#$nzBPc(ϋXpq)jAn0?~VQ+%" LuU& 錃dPʄqT"Y8'(#G0H]ĕ,:$l]*5h44|_oshکS+*xitNN2 j*Nȑ 8d  rq#py%Eݦ9Xgq!C%ru&tY\Ȭ"ŝvY#'Uf &SZQDLG&"DdD"Sq'Ca`_cPLvZVBT=A+No~DNhPps +ݳ`Dl<i-"fgϾ87wI6O6 9W.k%LMkNƫ":\yjͰӂN8am*-TqDJ6^o&pJd2 +obͅ,;CY.#SN~ٌ`tW47\^7^uFMR%Eh2Aٴ .V" T3 .M:>pԖp!(1̠afbX ͯooG`uU.0Ҝn%C*8%2zyHL Ӻ_yH ]FT)wH9)ѫ3_cqݍat ߗ;7jO4LᆱGLQ%ںEVѕ,f"aL8b.dP[^XeE{q^eY!Up'N'L~+p:|8$/@uh?dr[EKuωCڡz ,1߰ JO\I f 2KĥPI.dž0fhk=!t?dOås?7N:?wa܈N,#CQT2'G|qUD!NPq H#!'=r9JWowwLJRNrHas8]aL87&| c;3KQ]ɞ(6N4qj_"Ff 栏_{9nQy&b/E,W>؞l4y0ao9Ś`/S0 eI^4G^_lAϩXVU\*mS!_-Cky);dCpg)ZMmqi><i$1LGpaqK.`0^;,X]:q}5&T90UjBkJtX772f97.+RW\x7^c " '4ͣ`n;ŋJdQ`a ,W9ҵ)8a^~8j84T"u,#X╯}Vy//Pez%h SJdsr!z1z8\ w.+B p"K\=)yhN١=I D9o)Ll)̴BvƼ3x8|nE^9LoѢ"YU2y'a2ILdJ(4qOqq>Vx:`O|_>?₎x38`³K8"yu ,-?,f3YMwc[lpٖ0x"vb0n_n.0o~sr=cj0FUaǣw3)Y$|Zk9&1ː}n3R™gi]bhKXywm ]!t u3˟u}8^L5,P XDaH*cj~/1Q:{P\݋ z3p=v$:GcdںU<)bk<⨫(x ;<N>C@O1O|񍅑`,75R,:#S4 _STttey(.qɥj\þ3hf䴂F'bCK\F+NozfLw׃1I.b 싍\鲱}$Cz"ȾDC?dj87ŞoV%D=HNޭ'\i6žpy0~gd.)spIMjz~ 5eir!|8X,TA"  !BEH"3e" (Sq͌6XÈE*V^S ~ ;8HJv `ۚ{SuHHN8IW#Ǔp =u<:8p;#Es5OR=_M"> hkTÞuhP4m2OV2*<7X Tx[x"vyL,xC&ڴ)SBLk#ϫi$]M^zFL]n/q[jO ^sp5WO8 FjϘp3 b=j ׫U0:m;mKhwH騔3zϐRYՉlOe&a8"qaN%2\ FR ,é5*XG\.xd^'5M%ކUwmyN[+B1::ݞ#Z!EsA+3LR΢D*&poy8d$ht1^W`WG\Jdc!-D {NN΍U]Uh(qtBЄpu%/uAKͩJe8*qi "zGl\o`{ް|2 t8c^MI_=^bJhZ(vo? 
.]}NTv!{0i*XRvGfٙ+Z`e' x#ppD1Kd]DS.ُ-`S%өٶfMk|ebe@d8; {\vw'1-/ŁpNN-Dla&%2U;r[BF'7S uFٴY|M&͖1|VI?~?5 iJ;uD9hF(q,v)4> w0]iIO)M#[;>q'eQ ZԷ>P5 CiVQk+AY6}mG4a|6qPp"?97O2_zP!w$#`L4}?x҆bBJy3;FY#ۘh0=͛+\ K,(00V@~s46L/ۘHN_qj O@Ņq_-@DoL?A\= f4Mgti8H Ya:C#YM A~Cx`˷Kg ׿8Vs!n&J=\<ʛٱlhZ4rVrEɩA тICneӾ81|@o$x3M]H~3nY"dy~rb|## |YGmן4Ͽ~r'w8xNE9 ӫ՟(Z$Ei|O*xzXIO Z7~/c P9^2}bͩ2j$AMm-TeOM-+.;YVJIT OSN~&F|%C*w#(G>ݷ׫M B7|Ϫ7glѾNǹJʧϨMH9Iݶh )DFʤㄒ[N~ԟK fM7!5p9rڻNF+Bhq ÉyxtYcno/@LJmӮ$|Z d,+bq@CwS2 H%Yp6x^m6SƊϓ;quӇKHV3k^("NEqG+G`Cz+N[c'k;皷 =i7}gqWt#9}ТӅDƻgжzM1v)aRi+C'/wɣtam~u&9q||>qN  $ =Dd%J2!X+P9/'~x<ذFsO OZzvJ5y^ͺo@3+U\8Пnà~߅ٖ6^U&0vmFX: vEoLXBZgIM?În9ltףq'Uᰳjl1|9O1;|9FO1S Y 2r)z{_.|]m/Iy_wV\=Ω'k(ˈӉβi|t2(x@"{w=d6Qě+^r㰬 ǻ#G툀vD AIo d&yUSpXTFW"@8gͯc1vhY8mkC_G9߀yc[nX[cqcx4wn3ӷ;FZ/40Jܵˣ/b`ˣg񷃉hNN}b0?N8[Ua?zV5X"jۂQhZ {׌!$(srO3*&䬣L*duASUQF wÝ= d /#Wo'<6߫s|+kIc5|9l457!sډMji1~ug1Pj=b<ș\-ŰOLM*sLAhfGBPcE1K5D0f)=hCARKa䲅 #@>蒩o 39ޝc,=A] ;jks拶9a] YO03B7onia@::#Wyb}@ ڥB++^"2\ԘlAd:L]twtÛfu8{ 519 lL*jDDEؼegXPVlS 7oV1$2-_%x <[6lF%ͻ=ՌZk˭>4 B2qh%juZK&7Rg18 EWdmK㜉)Ah;xygaCJyoZpU eRWRɹ,y(eF#Җ|rlb̵xeg;bacJH$Ey81ԟܗ`IZ̘?\_6>RN:#Onfty3:G,+@w{@ wiFr0yI7A_zsqk\E"qEnf+skjSýiW2 =9C"1}2/ղv:D͍~) &#! V'aLji0$C7v4~ˣ!*wʣ/?q/&w|$M޲ubF /vaǟ{ȶawwVo)OxH {z4? lk)ggG?=?>Yw.&Ro/nbb~*2g?3%3RW*RJienX1Z52n1L_- n`[ʋl`Eo)bN.(>+>(gLyfh{h1Z zenѾB5_9|pƘSյ|T>>q_lhP6\|R~RNmt#[SŌ2N{gʼnB !˩$6_2y{l~z1;Ac*%[,u؞^Л:G#[zcO{zi4ۃwgH|ǟ 19X)A68oý4I? 
xT[B$-$̄)G/7{Lwܬ'+M{SH=hny5xF+Z4 yBpS<3#N#.D[,<炸1Gs<'N']^2ymd te5c߸)z;1.YO[Zecʐc=y1eȤΈH됎4EsDQ->8̖!l!èP4Ll HTf2Ċ~ ի*kv u*Cֵ'/mz˳y]x8",K:Io''^ P7{,4Vq??)L4Z#_:ҎQp#R.d1Ym@P C)JՖXC@!:[*.6P'ߵ'/Ĩ~`iZO ",T,E+7:';]]܌_u OX^Jc|3l#IEAu)6 KrCAZ1֬=SpÌ 0yWj*Bk%cVe:uߺ/T-4Xj2 [Q\+ uxsnc5Zo)Ekߞ#nn4Y02= +مeɭiX}g ;E@+Gg}omb2qjmb}˄,S[[Ax-Ahn%ovbm(0M ;g?{ΰsV괮{ehFWܥfJt+ݥ3=mL``In0.-xHr3>B0`e/lF"S̺a(Kn!<( ;3!0JigsUXYW) yocZ1 %6ϵ2=|>zi۩0w`W"d cJM*:(NֶːhDvdC[Do'V/kPZ ,lu.2W~V5X"j@# Ef֑JɴJ_ + ,*^dbj_uR@zl!U4.RD6ojZ;h6PyURr,CQQ*alJvؖ5>tCmNmshSEۜ*f7#Ӯ9Ȕ[i嫢O5"f:@FG;XuuR:LrPe!ҥ2+˓OV焵$v`)/ĝVt_lZ0,qSwTlqTia",AAA*#bA Sfj G 9^-F/2E Bba@]xKaXvm5nhg6FOL(IN_yF=ʌƺ uenp``c5!kRH-YIWsPK˷4 Ȥt'z@O-;AVĐ5U03ME6b-uw=\ iNTʙT^z%I-_g Af R4]b\Q5(Qb) *+AW#t)'ݳKx y L)tFD ÅWҠY ),&#-c>0` 椐UbVՐa0K$Mt~m῱T8չ{-yMy5{!lw {\|&)~odJjcۃH8t@qeOTR*L`V "È<>gr~t.yx_ffGe)N#5`ug]m(|/Z |leGh@hT#;uQ:uxu D#N6-Z] ٣`A] 6a24NbT@"V+kJqՃY[wTkL(3JBߛ}ۗ~%RkA]q5mOǿ./>|!;[]E_[<= ^ ,THa۫r/o cR?%E~.nҧߞWEcyW^Bc/ci?Qc:xũjDH/h)[Q. Jl2ycSq :^r}{!/2g@MG?ȑe#ɧT<0±@9hM #HrپƕqY ;kt+?$$RfX6ۨBh OOKs)J˹o:P!6ZYe{FiuR[Z|b2#PfTHE8'Ɂg"n[pOzJk!7dwBtA3()xtLB:UDVK&e`Ӑ bc+LNDacL!U`zr؁S!2ն8Q$%Ġ`X;UQQ1n2p hD=E/ qJ[4L^*N/c`2E/_Qr 5oW߿q .#6pJ5j .(G]yw >//KeSRxTjl}J ͏5v^U揗ve2"вkў)[> 5zmNSrޟZÓhB^ KW-gmw"]Br9cG qX|6v{w Hְɟ g J[8vak.c\l5k=o]B\1hM@>l6l*ݧj -kT1)T7BBVnC1'0fg5 pP! 
x `F-}9狳ś|)"ˤi!KA%݉8VTٟ9E@dǵg|Qʔb-9%Pe(9:[&qWJBh5%@I`|2VW,:T M֍@߉ ea[Ys6͛Qysb4X䘎n7 s^~:Ѻ[DTȄR&jYRg)N"tp@o#R6:fWRMUA@U R 䔨>EgPp|h82J9~@2 g,7m}ɬ7}b N}_J^mg\.!eߨJ#c5L(Z y9Q[5F,"Jv/e Cta3_wtHQqFAǗ$!y&u-N8t.;:Q18劮nt=W9PZ{.)BNUHEߪIB28`q\|\APϘSv9n&a~%R*Ջ7TT4.Sm6zt%s4-5/luy WjJNɓJ)9Ȍ[e@~]x*ݟ֙>M*JYT\(*[[cwubjR28…}K1Bt0| Yh S-BUk|XXyQضj&0Ʃh"g^dCJBԬ#@(`Bt*FI-*c:;\EOl@>y1C8TQh(ɡgږ_khZZsyCVӣ)Eק''5:>I6*9;CVojo>h#'h4Ӎ:/Sqerӻ>]r3I1)u{Zj;eaq#Um j#2-p+ EvU-#Qs %)2 8PNt@EP\-Y)fYE|1\۵h5:B[MR1vqd1@ 65ifo $L/:h֑?Gyɫ8S1/G)ِsq۫؃$Ʉbi!k ƒtE_ϮKbMN̳9 %Ă=*>M)m XRq@q<ٍ94 zňw3aeXVo3LAc9vwϯ_zgc4!3쓒 6dd;-͈`Os%S皨{Jޢeݹ.y4(Н)DE mBhRYM #8(?R"٘e<\GP͔r\\'e~Pe=bN”g˓4~fOwl]ugءkF!~e sUUpZ΢ZҀ-5<<\:6e2MO{HeKAf>VZy~*F 9crJe6x!vRܶ<^Od0NNoWwMH4>) )f.{Q'oq%]VieZP޳7]ǂGD%E,VPA ,+-~rqQ(#6D_xC%˜*lkMG|.F9҂ܺXӃQdlklp_G!jn=L>Ŕ9Err *q4J*Xg ݱ.awvv.SǤU14VΡB74TSk? .[-ʺPTlckSI,}b\Uv^ gѹ3GRQEQܚ,MJyDᠥzm"-D2QCTnreUMEɦ'b:d4GYCJk!4(_ߵ[g Ti@Ԕ*u\A ]50晼ѡs^JnLp*F(pXb'y3<2$位uxC^pIjyз5#I>Θ%2*+elYgY&%&#SĤqZtkR*[J)ZC:TA.(8wWjIJ&K(tin%Y#Uvwxl>˓ 77iXj㦂7Re δ#ؒyMGh ӧ턺O4 +!W'͍y9xexБQY%Ir2p*["q]NݷOo%h힫 >6~_M?ye{[â^tr_@y1~7}׆]|?3-"{"+]Ekrߒ^׹1/U$Š̌ȏb0LfZdKrO%,ѤJrR6LbȈ)ǎMr{$GѬmߌ'/]x[k9HfH|.ddlIR-mpMKdowB"겻K6w} \{^Qq*?t~5zb 'wE֋WZ̾k^dx4z9Tcʍa5=زPy?6,:Ρ9T~ A~{!WympNj{3|4~y<}?ztƊ:좽r_~odw?[[ow`_/rW%_z|4N] j _{#7,GLBoJ,hr_{"2}AO_Y_1_n8?ŷ]!V{2~TO^busts3e|{Xo_riv;Zw,~}x;uK>}>w1[MSؤ/8,bJFknmv'nXm!8RCrH;f/VgG`y;s R;OלTtfxKg!RjؿNGu8A9scbsv~vw՜OTs (Ggj7߬6aJ+vҟXYrc:Ipv3WYjvτ- ݪ3ꭢXG nTcݏN,jN:O;2<)Ogj7S{U~XL sۗUs`Doàf5Sڍ7 xv?&4q~"輠6x٫UWV[|z5Xfa/nJYJz\ ׼|:ԋRZגh84`uµhRmw/BIKJ^9K!#d0,5Ҝo 0\ϩU7ekz=!=r圡c8g9CivFsI}TuT/&> Y&K0't l$9f71 D9>9ߠ#i[nw!>xgO'mPl, fBj4xՔ]ba$"L1\ <9=mDe#YR 5€%)"K\j ԀVT8ڦ28tQbekXOǾsP@5/ɡMއ!Xx+8';yT IfU!cҒJC?:g {y5>ZIg5ZeN\/' 0ToNvwT;{aq^uVƼCes:q̂b7о-#״e)RR-REm1/FG.6%cRD;6L?jw=*D^{A8/?Q0\vԀ3b(X8LQ4oVF fDY+Xp ᱠ)tPO9MЍ#7*9`8%ͅSip[MØ}+uUЄJn% ڠ,&Mrؠ-4 cRzcƋ\lo ׾oo?SD{u~p|r\lz,\vSoz%T8\{/kČCJ& *TMsvʝDۦ\ڃ6XcU`P/VVM1)jDN~yh7@԰MB\EKlVdUiSRx BuyU! 
}zⲃ^;, GkPMg IA>h}Tm1/*`TupfI1ЙSʈ?k~XY+{;4\\}&ʜdNRS z-ViHLKMPrI4y\TqrͫX!RGЋng UtB%_u}Uq|UsbWC^.'_u}Uh4hl'O@K0]'9b`bV́ eT[O~x]'Ӯ]{ }248 :O +ɴK3s-H]nXF)oO_,Z/jrne_}y8"=( fyi1B#%=z0؆T硈If6N/^Ww15^|?\<+W^:pxCY hD9#gEt 6ٟ:!J8xEBrvI; n8 :L@rQ}Z)9UFh~t~|=ٶG0lN9Vʚ!ؔg`Y*ܖGLZKdY7r'{udXs]򨦋ӯy&3psP& ~qvlu3teVtLnւѨSYV<}"~2FcxՈg2^0Ԛ:_ n-5~TSȃ-ם^KoJIf./.O; A E*tхdR9a{հ>K"Wn)֤x =I#&k gk ܺSx+6FҚI̥t`czIWCGLMC֊T>bϫ>je4\k\/Z`N}|?^nD2і3+}!|kCnTW^I0V*e1(U"8=s_ 4%_# Tlb&qu˛#w¦)͗) ʼЙϴ~ ,[Fl>Pnb9#1wωJyyϑ`H (Ȣ>W·˪H(bܳN rԢI;yhX z-stycv'8Šz~eAE]zg.$fg*_Kߏ^ eKb*d%[90jzDzY ^: 3:I Mmn:sI&~%F B&vv;Pj _LD8{Gd+LihW~;u퍤`!C5 ,We*kx6M9ϘIr˵e9/Ia^nBV5v+TNg~a :TQZ`Rn.X #U4^LybYyys!fmTH㎣q֞TL3G$O>L+ղcwTNwpFriBu;;Ҝp*qO遅Z+^U0GH-FKiK}VirR΍ֲ֌sk5{85ٯ#T4@A/dth9}c8H1raF_߂F<즎pBHEvIk1<ˮ "`@/Za/*Pg)d΍ +㱿 # g҈ӶI~έ\bu{eZ!)vO p2H2;cdSTlf ZϴJ֣ATJ[AVԢ,I,4j;(RfQu(x@Y;-~f<, ~ r|x}T %j;ضt>hۦ%j³T*XR *U)++4*@^iln* |ahRƾD!) 1;c1f@ZiEB(D~v;dL٭}gy`jREdPeXkڵ;I^xN O^ޖ欝c-e$/>yU@$w'ak;grFjNT%ѐTGSR9sMְ]׮v.2{A4QB K]|.!px^ͬ~ǽT&?eFW*ӿ˛T4ĵ)W5|1J7gޞ}Ǜ1vs|t]8s(yQ݇~>Iu`򗡸0Tw7!N7xdzSd|lMZIvJˬ 0e~fϧI@fl1Rǘ6&s $Yp^/Ԋ>G,#wΓK\@!ԋ8MF6WܩGgj?'eN/4vq}^#0 rq.~nymɓc?CEX!ݗ"\<:"q[ JFA;W?(lWg{ZVfU쩏f6*h@=C`zuxu1#OG ]^aQf_dqw!T 4R% +xdf%#fQe=5 yEJWprmpݪ=_7'&iv Ӟ<"kD^}#+|ep23z}8pƢXxoEk^n{^R 0* B:(JU>* o5y̋D7#Xz§@ ʻ Q;4s w U3 ˼T#+jw_?㉕?\>;s} ]>o~78?;=z6軳'~WvٞA8|.p8\H'3 H9(()imʙy#)oru[أAºZEHe.rWoxщTV(!nDEEӟl<s]ΰR[˳}| ޕRݮ4v) 2(^Rw.rCPqo;yGUKb~>&G:KVpx0%v{ u<*]8[HU$t_.HϮN'BPރy#6,cxHb9ŁEu-o'b֠ T/k}ZjeMtj99 yshJp9kjm`?6c؏ zóL"9Q) *HSEt"81(,V/PY"Gax &_@A#Nۻ-M97_g4Tۋ×()l"Ztڠ ŜmZ]J*Q1<*6iUI'n!V/@] 2ëK7*C,!ٍ.ڧ^W+-K-y[n[K^թ%B@R%"];2#ťn/?Z!UuvyvL+ZBFC0 _V$Yb< -+AUV[InNWbRO a\wW:=q;Y cLߝŠ;L|2 ^ +6 XU ! ߐ-JR:wg^a\Tj&G v [+ztH4}QpmR[$|;LǐFLU+jYcc- dۤd20V3BdD+DQ|U,scԅrJƶ4N(/V2?;h3vIhz#&E:V LSZ9YI/ؠ dZ#K5Pr\& 8?7( ڢbaY*MnWJ^"P!E4dX]s|*(Jv/J.24R ?wQZ0t6We&#_d JV_M݋uX(NlPEO<ՃIKTJd nf6TwrR9z guM, \?/(vj:疕EmNC;oZ{F{/V۸:|~ǍKԯ,~e +K_Ykm݆KF V"Wk06IpDHJa2Y`g VЌ$Pl ۪Yv, 5插A8m(fmA\5h9щ:/8k4d*:֝Tk[? 
P ]qI9m)Vnl#?A/ ?A)ǷYi3Q7t  ԛ<^?Mӹa>L3LqY/~Xu`H84KV U##96v QM RB4ZF5ĩ7T`pE9DTRZoC@(lVb"7 QSqWr͍D] F(*B@GMi"5xUJ&B8 "*@~!cU86h} zVwuiP%7RΘZq]җ&vߢ)F@Lg 6~ oJ\ľu((57 `bl*E,q1a-G)4U4P-bR:r.FU{]|FJMN# + ՂH$#ݢO#*<X}x/?MP[vPjAm: ʽJ޿bAq,51#wC-{}Q^~u_~f 1`&E.&aF!զW&`Ti . Ҋ(Ed "ϓGy2__pu gLlJ^ewκǰ|CMS*Ogb\cawfZyfIjys:YvaD ƫJu41Ip&b .":fPV.3Z,ňPHv4*a PR$/{S Ѱ~I$D"sQEa\g!Btʆ2YSuM0 j3o&$%ܽwvOr =-hT+~|,9A}ڋT d,,e+#lQWDKl9 8=^ʜYsSڦ"d*UlX>T$#Jތ Qiզ"^*BbIQSE>jS>P[(̙jSO(|_vke m FQX1 E+:8sD[;Y5e'[^p+5oHyT@%ۢ$8oкEa*ZWj ;\JFLHL@Gºk~ؽ3&^Fe}|i 7_wsԂ,Ԛ^Оd>^]G!z#%'?w̚JN+^sdO4遫:|Q#iS(=1;,#NGHbUs%5X%h4yrǫ3c)I")9ݖjPEK>ahHfyQrO =`ł0I:2жMO'9ڪk/MՓZI ]&d@,LEJEy Y`aD TAXPЉ\2LCY3Է^I N 2rgh#1XzzIUyVGi^1t"FSUO}2~]̵,ؤVL[SdG>-]ruRv*b+NdRbrsX<U;{>#'}'n?GFքռMMQE <10"s(u_T %׈ZK b> "kP.v/y.z>:r lE-KY6Jo曕v 3. $8^Gi66U'G[bPD 7yI }Eah:ޅ)68;%d]]i~ɩIr$)ȋC2.0JW^<ת ?|iT!$mo/G5R T  _'H5KsN볾-tWjvYg uԒwTJ`\ӀaƢ mXm c68'm=RXhjmL-P|,mV0QajbvIpf7vgoO(ۂ )j곅r*fKO5$]kC HtD:xk-gsԺehNw*DuYo+SZ+بb"-fŽ^x}j{̬) m]ˌZ/l,z3RS}~δqݼ|&;Zrq&Exވջ7,!UcZ3E6cRɭ}{a0s&ήǛ7,XYna?p֛'{g>Kf_̂O0i^y6:vi- [x;٣"2wljRP mt)kVV0!cXl@@u=\ps8awS j;>M}dd'$ ;4mm] ozǬhqJseF㫆ٲ1g,G*YXt;;ri1&]P>zYMX%#+57B[/S~/?LMk~5jziJzuNt0wW5_#&{経ǑI)WĂ <+q1J+ Z&ah{6 )תPK+O>c%oToh Xb'@'*:o=\H5f+vSE40bE@:ݗYE /2E)螾b0d}P.p1!"FHXfQDu;PR+͜4/P`,*D/xGH]7eIK061AB~$|bM:T #Y$$,$YEf)GsH & FQTd5va1"]S4 LJ#M͜E;FD&6R%:Yk)N# 7"aƙ  ƬKq GӒZTT+XhQ§0=8C.!EŤ2~R z&LP@:-VuF =D*mm뱵6 c#HhZiɛl4}uPcK (FR!FGy.G%1I|H7? ^5w1[>:SW i&B̸Li/@#z|^q1upFV_^E3ƃϻ0|2 ſ)m6qicD!1pl9qQWUCq֪g_0{M~z](h Eͨs妝gra 9 "zrHbM `F2-'8`¯ |fp1ꮢck"mMl! xv9@>BP [)UFX;&+(E,qL1Xn!¨PJd1Vz91R[ ",SIhH+b `MJF3'$ -pa#ymuȨD  . 
L* M jUe0 g VLM`ɎY_Ȯrd|nA,,1𝜉zivAM؜*wt>P^MeͰ`Ҥ&JbU_~0 JƮƑktT7UexGHg3[Deawr) &*pq4I Yccaɐ& p,_.zMJYkY.ζXR﯒1%(@?u (K:6O|}NwN)=ELE d-' P\9"WA έ: Cmbrb$"X'Up(_F̓qljHab#"iGP絺)r|+r^>4%>¨LH5 G&b` :(06[GUDf=.X$Ckn=mu"$QTnj$WV7H>@GQA4nhQ1wR7*fՠĎ6 A`I*]hPJfeTs+#G뻰;ZV󺖕KU|.vXZvrup k4AFEZYY%%`u%* Naa"ܪ]E 8':jV>kڿ,K8Ja+qI5ņ`#b"LlNz8 &Q""a!KK:Y/XAT=8g{Arm## S}ac(.e]9r DS{r",|dQly@5:{8xHY]v3E?]0QDǒ)4*"$u+wki$JPr}|l#ˌ i}3.Pܪfsyهե vjq & b{Jju9utѤ"5$@]_}9^oT Tށe @s!dwfBK 0kMPN(E;,OϹAl҂yG̭G d)1",8Q,Ň_SOx0BWƄw4x5nфYf\ӳW+uSC!Dc)>D”31aYᣲveE#3n)>T.4Qԝ$<KkU%i9"EPf+勶 ؗof5eB]Lt.Mk5ZABmAw߰5 i?Vrlұ?pRJ?ܿ2H|USǼjFޜVa]9*%qK'+ðWqG檙2W BsYFFM;"LD-ya}pKx=T.g1ͶwS׋Dp/SK)_Kd}њmt ÚD7ӗbyyblg|ƓOO]>+}, F |f opťκ6Ӌpr>J8{n !8.{pq aެ9,떣R/u,33b{ӻf]*@P]h'Y~9LlZ{A?YYGn3[DYsRw0ϟЫDž;%bمoϟN/g? 2sɑɢԬ>?X˫v{;%n@ix:]a.~>1?0K-f S;An_\zS*v?нty=?|>ώӺBAmsJnIe4K`f6Wazit<1Fo,}q3L^j:_s]e=XءE#.z:\j>蔓0feRYrwKgtqOVK9~NQLͬ󱽾 uxЏ>>O@zןӳLٺpLL`7oOcoϟx{<}-մ _쏮n~~̲Q]~{?`"]EvƣCJ%BҏCx:?^bAwO9I ,'t4N^e#WL&O4sȂϿ oG>p~M2e .!o/TF/8jYMY#s+=XEtd ES]a|RsϊAw{ ?0Xml3ogƅBЊrE;>'XɲA8Y9ZzzNI7m+joJ?ޜO6Νo_J3<}>o{h3?@5Fɰszmk 7pۭͭ Lj5%3el4Cː6\>z٫#ܵZ W@ygYx)tUR=]0$$Jԣ͊|a8V9Y<~ӀrV@5 ұҜ๣9AtL 58"#>8`wLŨRFf&?KmH? SzFy^cD(/yjvwbƎNIj:OYHIX€Sf1h $sI0psyyez}%I`$ISlaav߃ΣݍGFŲav_ b/l =A+(_^Sbw;OD]fDqԹv4vqra&'NõϷ$ fBJwh&6suwٛљ׎?vݹ1Աb4vU΂~\L<̽5~mCtx0۞7]&GOmOÍ;鸁Cs4XLi@4Biܼ׃wR,|GTl] V BJ#T' s=Oˈ 㨅C.GeQ|+= "?<&16IDŽ,el"12!?sZa H<kR* ?΁x bJVGWQUHDsl:XvruBX< ސ'0 K [t%Bex|: X(JSIՃ=X'<#w|'}t9?܇#=XB~ͫB^7Q[D>3nh&~0{Ђ>)buxIJɀ%A=EDsg̨sO])% =E0?$rY|ٻf5/e?]vX}Zv(oݯV%iKEn[eᢺ~{Qū.;= ˻"!Q9IXnIKul? y5GA-r'B~Re?O GW;L?ړǝR%F2SE:mJ s,"ҡ{½th:6~ͲqM0ټfXEڷߢuT,FMuKQTb[^`{:q (-<{ f[bq7ڟ)U{Sd+h0LTJKZ ؞m|^WyL a0zp5ѐK" Kp4k hS4[xs;FJWݮSRQHNCCsN+>@iuuit5HgFrsiA)nQ$wiޕ$Ї};CC`ٗ`2LJ,it$ դII01|H-vw*ArDn(Z{ЈbL9F/EY{`ЦkjJdlO3ד$p݉D[X/D T3Dr VB)Mn\ ȑTR^pPX|678(ݛorƵU=NzGN-绎 WBn62sx'ACCQB%$Z!̑( ǘ9HӀg;F@. 
GJvS6[H<'kXe梐&rsX E8)i FB4>x!Y-{(ػ\pꋜTkԜr jh7gz-jO@Ğ{ctOk->Γ|̆__}\  P@ ;᝿ JMٝ[v f5sw^*Ug9ao4{a#-Uw zcp_de>wS&ٟ9sn<ͤg[<{sv^|>x˦ǝsiS0IhpOΛ '9G vlHWtVhOܩ*820¡,K@5*x6ZhE CJULLs_ O:[ +N7x~,'X  XRfbJ]\DÏAZB0I$&K''6 Cc@!oBa?٧}u_jU\ꄦ&ԥM2ʤ4X SP. gc#QXXP!K| d E$5jl ܏dJTY8ϴ4)&.d݇gGVhfW84:M?dYؕ?̓ ]5UVf_>%g?}:M=*^?[mu6A_d*'\ғIZ\R_kHB޹TP_5ƕ脾vBEHQhGG|YG6e;΋1 b9|&"ߟlp( v2 tUF+~vGe,fҭQAR4:藵?GGNȰ_g#}8S -g>PF`Tʹf ySHZRKPP[QFT NƐyQ +Q* .a!yTVvn=j!e(o}]mé^OQװю̿=ܥ,gQh&v>@xUC5K̵-w:X߹XRgD}5*o Srۏ6jw[f!i'/h8who[Cs1DY!G.8/4ýH\!mNÒ˅3Vj$U-ثTz0w>[^w"$(g-(:IV!,,HA,2y]Oq? AHx0.Ym:dE b+ l!h"ʆO B(mI/N:1vZ 3벖{I,(xF& dk.#!drJ:P3@ԐM 6 0rmCx}BXo^ӂӾ&;|8)d6v* %@Y 񔱣 U=c]2_qdW+ZeĭZym`l#FlﰵbuOcV7R$!$ .eْuBRdc)'ǛbxN$8Kr( ^8w2LE#mV:7o`nBko١~8jƩ\nmm 6a>tkkgtSx&ц!.kщ"r 2_K̆AK A&MSV142v /4x8Q(MZ;(rxeX:uçkukFn-E^l(b47$4Iho*'\rhmX&g=b!xZҬ8Rn6˛Z7lCm+zu_ Kl<Uh Z1a޺U)lY mp!X4Wxj4O^%gyj,:ቼ'ZN%^n[u7o)KЭ`||yn{|3=hd/ zҸ)rՆ ]ZF 2 CvtK2M@?@v̗KxRݙϵ}.W[i\wrm0V:a?.js?YS^yKS r_ሉ_ma;qҐ;=͂Y?,?AXYFԢPS}bvO/3YkM48(VeT>Rf0()Aӛlc4g+ G=c'ks]d[e)1N}}bvf}}|R }b>29kqfdEqzVⓧ6rvkˣn so5gPmS4pcuI;cNxzf=7=g2RcVCA:= .x@ "m0;E$DXSF\JڈRNȳXsPBH %`&TBfn!\JɥhTq%-8ZTg"k'ɱ*V_^IkOU#w4k/ʹj)՗V9y2˟]7bI?8T?VG` %xJ 4D2UҘr{^ad4,>^R ??zq}~ =v5Eϯ~+ٙe#ktoc)W?畗u4k6.~Cq*4PTKvZ@HTAB:wP"mƱl =V~ۂ>9߇H ~=8BBs""N .>ʁщpR~ ̹sP-!eU?2he"tp6Y/Cl [U8$V:eB^L_$ĮQAؗ/Ok$-"m6}~toVoW?z^鞅Fk=䯟yݿY}Sgɓ sV`cYʅeNۑ* ?_WStؠYhg'ڵAe>y\V$x*:s'* ?~Py9yCs0$84}-I hgaah;;o 'p%&M[s T0vb~ī6e̓Ǫv1}Zr#W xm-D[5rE-@*ry\{1if:-=gF.u^meW1t0)3V=ExՄBg*vAZ=$6;cWy}t]Aʫ/*:˼(EaOq+hNp+0lOnQ@}Qq|+9'Z_G~mslytMR1 N(X#FbuR q$OhJXc 0!{& '܏(ѭ~q}\y_65_#D6$H:!?8#8䉱#{W3H>m̴7Cg7hrNĞ:7T&3Ѳ.ѣNeb!) 1#LYd Z֌y=S)Ueg {gDJ* +uw HiS/ @))qn4Q5"e/'0!Hk"ڥuQB(uVֳYhL#3c R@ޙUgO2\xA(3Fe*uqhd65Jݖ(Zw@ Y%jFmR3"wWE$>:u}ǚܥa+ 'oXkS=PãdГí}~1̆w>QRuf>6\G!0ȝwp>* T>ί+ Ng l펙 %gkNǙ,P$kX3׻F|auK9ۂ{Eī+9syps5nE[e~88٢pD (r{.fXqc° k) )mm> r ;7FQB B@<fp^ҫeX7ɞؓ 4g^EϾa\$֤S f26y`ᅐsȒMV . 
2ޭ:bt>nxs&ja7^BOGSbU 9B PH*;:py`ې?190S1jy=<ןRZ Q=3q"E 3QsϒQAoRR&!hYf='x{2 `)"E+$ 3$a.ݯƭi3@w缞cˣe]aMQ󻨛z7`E'X0,1h:q`\*>gn}FP) kijzj8]_$p͒QyI <`|{ Ig;Xk?ĸIcj;NqtIzDQ";+&[9F<)[GBv7U#5j?9| _"[u9VʄDLyi#Zs'Išr+rSl,;NJ+Ps |<{v}26(ӧ ,W_\C@Uץ7BJo\@Us]PN_|7W=~ם~^\%ϏZǵj4OWސRJ W\!̐ҊZa ~ .1SZltpsEX{,z& -%/Qs݃(ͩΦXnQwZ՝d㗥8Ame)FĂ*MҒbl8B 1U Z/U qDtp1o")@)H,YHJi#+n품`R $DU$PdP/,vx A'JL]DBvߜ2mL' ]+, lcR 8Z@\*dduhɦ@k f=Sne ցH&Od1,ĩHPN3%_0i,zKD]JFV'V2! <^ ,0ebQ$1F vTACHD$`l-з\&^_]wЛ"n3?/,ՑYʪAMO[ Tz-E)C"N0Gwp7$VI1 !uӑKṀfh?ǬdiTI1䲌8 Y5Ur9\LQn AQW2[´N3 (rAf/_´=3 3Ԛ[#yAfںs^d!?Iftںs ^C0J%1ПuwaTV(ibt('cW-pEpvM&Nt#WQkݸt:*8ꐻ ^nݼ}tMpԗWZ[r&$2,opfz8Sxb٧!!) ~o#SSٽۿsJyR'}_D]˹BQTvٽXb ކFd.sD!6) xɓZ82DZ5չ?!d+Cmw ݻ}TG`D"V,&IQ }0C? Λ{Ȱ׫[LP R}xr:PuȄyxI¤|"Lsj mLۤR/A%ޛBhZ8QPPF"X䙰?01c jr`Sy!(K-U%Lv&v>sJcV/T]p9mY!,2%!;': !U17T0QDq!Qqg}xfj:{w ާ]4NNp&)]ًS'>o0C=xM18.+X)j؃lvbȅTla=pQ=WqXzuRDi%Y`zWO?C?T1ZsE@gD G(PeeV݆uK1B]{ .)GBvd.e+Sk??cMIWϘ}7t>F ݝ*{_C9cEnŀ`q'i惗ǨĠŝٍ԰)GBW&[?/>/Vx9qjP(\$\T"xNfJԝS1ڹ^ eF\׶s* ) Ȉ˽%֝b`\JI~Lr5-嚦52t=(5RMAMMI)Nk0sQ9l^I.G70]̐U<\V{p2. jZ0VoiHKnr:'BVQ ɻQ4ʘ%jJMlɥ%JthI( 9´QnzΝۻUzU]' /TYzY#ċE-h\* B&xl)[& ;gxz P֕[%h B~3Ny $%G\a" )";Ƃ!Nrm}iIJatR% 6 q(U@&7M5פk6!3v<$6\x&' &n³K[ʝa)fOK^ H(%LaN'-2i4tyR"^Iӯ{w=nH aVF؛ V=~{{;ou3 yȺM]+]^`+bXzn,0a&E V _pЇU.!]ՇmKHnTB H,۫||9*Sz^Q8xX 3>KvAA5iV!Rx/j|#$8RORE">^UAP8;wA]#%6 #4ZmX{F 䱚rfU 1pcPQ)s( &< S I2Mb*r.dd :ehwc|,9];ȅ1%Ϲ|@@b+,$1K@:%2cqW ^Z(/Fg: mBzŢ^#LA!1 ) 1e0`3*yJd2T(fP)Α@`N&6ߝ匠_oj):}Ub՝5 [l|\ 30ad>_cX>6έ&qa?=b28 } >1Z{y%#'"(uUAɽ`IH]:'rcU Srs[bdn n=e̴^ /1vOϛ<)^۷{h#79E }H'xGA6m%nl4Mv`~{J^͆!&^wGʵ)*J%Sb9q 3ާlj 1:9Qv$ h$#y$BQ M elY#6#Bk{+֕Dy78E;n)ʸ*s#52BaOD23U& fRwaDW-<^T6-d$n{Aorf""_ū H)$5L'fnEL3NOK&+CV4Ѣ׋L֓x;z A[7fÐOӂv%8KwfjFqOZ _Mt:CY(T1jrv> gͯ."U Ew ?^_󺵞k*pg(c`bNi1TCF6#قW4ƇjX``If,l)e 側\4Fj`*`w9bLcA"&8"D(XJrU7.lJcvY<ٕ.+%8 k M[v o8bÚEd%lZQCx~焣@K\qF8jT<{~v3%&Y,:EFF뙋fɥĬūqKO9A1.V@_gd˜Ep. 
f>X/rۛ Y0(09ғAIe2z"\bR4IAhI팟=-aD; 55E.V&=Y`we,MA.C<ۯ!&b\d2(zd)Q3א2_XBlQx41XK0ISF<,"(4*hA2FCT:3#K#J91D0Mi6RNe/P+ ؂2>/w܁Z+i?5h͒j-Kp<|BhxO &v 2#XQ?ey ZV,k n߽> \N+U_AP" q]#.I7-&mp?oywxkgayOfՃ6~R8QB"RSQcy_(r)c)(%s?$A, dJ) sy)D 2X%c*~^~6sO58іg Uw_:N2Ր  HoTMG=ApI`(=O,D',վ/ujftBb2"``Q ;GqFVyn$%=hB3}I !i#bO A?0aX1S@CL#e#%z7JNÝ@W.iz0nEkƿ6R@_AE>zi9h~e;#P7[)@y]ؾN{X{ SĭVRͪ z޳JX^ngDǩWt)Z].L`i; X?CI}s Gct5aoM#dIxfP=;CT_]{{Ecuq[dz&p_3ur _ӑx~e`0=V"N`W{NRPoxmG"XM̩96{J/A TGn~&=<uogF'~m͈ *gƻw 0 ['\;yu !ҨKx;\-V0~ FuYoyH(u =l*h×.‚ITy 0os56Wg8ҹ$H|vMnϥRa=qrU+yfc W(&zRDA j7V{ ȊCg 7*UZ 'Kn-;N%+yVCLg@0F!|'yڥv[;3*Hg -C_ 2=U3lLe^^>y@nf]*WHK+!xw:|;ӱO32`o_ܞɶ=[։B˲] ,i|Ǚ=͐7ɧ̎ sy'}̏x25ARsi&rt/抈d4CÏ;t:4n~,U%Gd)CE) B/?{f'܋0T6UA1DR=/'e2""U& [inUG;n^Wfghoq]dqe:C9*l:ñ y8~hv>F ۄ?e͵]+sX{K5CpNa6=[O7.;_Mq%(ΧڳQkD쁗0MzF6=s{)=fޤ&{fd/uK )"hȍg*dZ$vH)L,A9YUr%o)g/UlR?wdO 쫶ig; &BU_{TRr *ɫoJiT.pHb|{U{ 7Hy{N@v ebѪ ɭC8 ,Wɫ?Sd#x}g.COk-u(%CP*/-.yeq ~3 ~ϳ_1λGCE|G #iSXy}eiiJ QhD[nI&JƬ4Zp^R-=*5ZM_[~SG;#MY˗ǀZ+l=ĵE%~8k=;,\bՕn4zSwe${@|U{pJvP9f1FC_9թAhѾ\BWPPm|3Aj՛2)i|e%@ dՙ$bÁy7_/OZsG'=WK0o Nǹ4_ =<"Ē`q)sJtZ%gH4ÿ@Gt;# *uwuC|HPE>6)QK-1i241lT^$ӄqy0F+ԭh('H O{#qADhp eLTRJ $JFW[i+\A9d6"BHۓO򆒧Q%k-v%nTɠw+b|AVBiJm'}Н4QesJXoϐk8Wn?X5χ#녰ڬ a^Z{ aAF[jp%nkmStwpiu^Zx/yYyb|KZwǥ+:a`ZIiU<3v0o'g/_UfԤfuZӗXЖN"c[F":4f%8J ;@kΚ _?xGDK z_Mitg}D9٧tQN'tQN'P.N>h뤡R1,h8%DH5$eDw@(,+?7s^ǔ+P$MGyы"9e)sE9eRvo$iu3xiE 6փk +~$"mx 5x䉃Nv^;,YV# >{=7 EΡn}s8At;;|Xz`:{:G?/IW'CF,|ΖLps~>?πwtj%NL&|&1ڲ AۃwApWw8>RFm2cxkƄ 4X^q&PjIqӸgr4DF1E #MI"&Wt))FGG1t7WoMcsy77oڤR@)W;ljU:_~; I.|95>WnK`0NW~u]ξ ĩǟyL.{<<^}{rTnc,(:/q:Yo|RQ6 Yv^gStrϣ)Z  5G dňW(Q6Ѣ G4$6.<<1&S*cDuF(L"&qe6kRg56)ᝋ"Wi0]| kTA΀ oj j䈈̣(kѫa*$:EI9; ֩;/ܗK=b|I g\}I %q|R#IK^5JStZ;_Kgt`#ۣzSDsD dQF`c3NO&3dp }vu;{@*5W7R:EtEoV2%kN 9z`Y*xfw#.bKPw>RGtMt:4 ۔,R|y?g)P]gMa~{٨xi"k$2=Ete6ߟG}q.~z KY켌g{|-~g]ܴ訩"ɁPɢD#G$F)I#6jm2pjdjkP]j8{28t<{:ò͒ YZiDUXAsb.ýOcQ_> 'Q/$!$ @k世^הp|쥊 n/{JY77kIebG-L^+8Ք^~ᙌ]~TdQk/tF d7TZ׾B)mw|SqYqAhjzrr.H59ʽNΟ"5%ե7]uFyՍWJy's)h @JZU(5 B8έ3~ZIÁ7i8KSI9$#X _H@Ԓ:8g$0u$:" ?z|0]D$o)`t%cA)9/CEht6jF\@C}EEru TтgG2Ae@40 
*pmbb2D$ >JW.Eaռx~ӵqÌCzU Dѥ_[4v]c^|j'@!#lv6`x!f.gxR&"Q*8G8׌PbDk£P&FJRzCzZJB)駔BuZRmGAd@?{۶Ѭ9}ib\M&;-&xȒ'?.})@cZ,#H59 '$|=/ .Gwょ*6D;xpijsѳ`^0z Y]6z ~ Rc &A3qC1%I2;z!08A3R6-N˶}6rMSlj3y`ی+e;#@`{]q ۀ@ "\I3=rgacc7vZ#s|qR= y1`{V>'vuzd͆{];#;vBeB—/zs2D09's, c|܊>ޛ/6`-VgfPPkY Dı8I^ögBI GG~Ow-hdaayr 8ot2 c|:al18H FOa>Х|l^2'dNgb˪ c&` u<0O,zcRsVX?Ljw9A>xԊ%Kޱ Zq׹0,%׊`a.3Fs2V\X|Nf?DjN7F$g 8gQG2`dӗ?['gޜj.'d9⪨B(BGFdr( DSN0QJ.~pԭfK H^'3LHU;g]1OcL` JǢ-Z\.OKus_&RUJT\̀7r%nw<c,A#qa@p|Y=}:4]?FVn|ݹ & ΝgRDV5>ARwCll1צ с`< ЀOFs{ݻߎ_;Ͷ6_&Օ>~.=otώ>qL\p?~_nonߝ_6Y; :k53fnz̝۫/|quקu3aǦCl]Ϡэxݸo'n I؛~t=V+(A70c#ZǤqk_?AnNge ec{.%9Vc}gru# G| V7 zB|>'D5lgm 53PǾٳ ?.܎5~߳ϡgЛnF]$\n߮:^һA2%0[ٟ|a^L^ 3}=^w.ЖJԪZ[]sΧt_qEG(@;6|B2όj2ng<52Qg KT$)ۋu-'?~Ga6ǵޗF߆z?M?rc~8~wkgsUW>Q]:(٠kQ>sb|ֺѢ tmzZDJ<G]C[й#hc Zɾ|{O/o^x ǰgslwzwE{ޜ}<=Wxӟ//ߍ.|E4pqa)a8r86'UHz!sBJBL:9„KC0JBeL"ȀcX Z!Sc-ta7r#\rR%341IV8wNcJgTZwFekX͕;*p GJ&r9a8UXB S)8JNId0tǃY }ss9B^x}>tÄU'Tk\]cqAw.ނ$6wOZ5Om+L麂[`:}RY ϐ1P^㱊2RbL\c+*[+6swUƘkTl03XHK*P~M{r  7O84;Q[A6خNjZY 8]7wV"kC^4xm9jR~/S!2[ &.e֥3p19F%_RxeJgZtY~y*PQmN5X7-l   Mi-(fzMr2mb%y&@[CSD $V<[~ahrGa"Y뺵K(`D)SR( 6i$FֱvK7auvҥRx2р G P@BMPj!@3iDb..lr:nV)0%pL>j 7ĚEuڳ&LU]޺R ?nNB%$Nw]%m|}ݯ/ }ؚ ưvCxw>容2Ub7ig6Mh 1M}x? }nxMkܵLQzAdlc7&w:1+TkNzCɛ?u+ڴE?Lr׃noGT܈n:cPABvlN"R\綴̒ P(G$(WR`3AI6 a DDD.+ʗ6ikPydӽ=TY,,jCM%E`2$CZ4;E3E9 e (R eQENUa)014\I /77A5ں*\!C*Z[d;@a{۷}P"AIOBa{s(ԜR_4]GmsBaN:=Z5$VAjX-"v j^Kڊ ZK&ˤ4b-8rZ^|u^H6y j Ѝ] n %Jl oʲ.>~Enre8C}ZM;ntrpZi&Aѱc*oejm(ҁ%=QtA&6 N)MbBPWɡbUK~F/Bdĥ0m$L6 ۩ew:BbH&D.GľfA-oʦ57zţ>ZZ3B}KAh\nNp54>Vo[ꌎͶ`3U2~3oA_F}єwCt&Ƨ.[̏~ɵ>Iˤ"cx2Ay4iCm>]ƴ槫 }nZ\q3;ztˋ;vFt&S3QgU#"ШƄlY")*sVdռK\{A娹pǑo d^G.#+~\5kȥje^ףך1^3pO1E7 -S1ϐZ b2Zgoًo&R(g<-ꭵt-}u֖y )gJj_͐$R{GҢ[huXa;Vx8x>iVjKWnxrx |/v0x6Mt{i7><]韧>$X% FJe>pE)4AQA1#5VQe*\,U:Ԛ&]l~s{wcz?ʣ08ke*Od>Dp3~M~)4U*Ob.NBYۦӰE+cB! 
U2+yRJS[+ k +\9˙#(%-2cΰ oGiZQX*V`SxK3;5OT 4 bpzhDNs'k[(s +F=VN3-BQq6 ^l(R"(WJW-bHmbw=0y ǘc1ػ8_駟JŘ>pUezԊpE_Q` ­ްLF^gVxf(0P R]F4HƸ5dt;NB>lAPW&^pMO0ĉT?.dw+)AeiU^)+ٔ0KS%'f <:?o ̋B\-twO`Ϸ@ޤnS0k$1m K |2({yn'VqXpF뫻{^P:))v4q%yhCRz(V>piO}Zڍ6IebVfpU ܍I5$J%JLpL ;]XAU]BS5!s:VN-T=U\1|0rg:&#'\ʣq+CU 1Kt:%^PtJɵD;P4w"4Zo{B:] @3#$yZa`SD,nd6KfMvHiΐaUY$0Ѭ S lDHAGA_^El)Uc&J,XsrNؚ9&/($!X`]u98ft:7RbkERiOEW(SIQw`k.+b5Ou/ ǂtV}t+Q΀Qz$X?VPqƷŲ22ax'TWiަrm=S: 3g<'sJ[isAiF*sٳE=ɞmK*4iF$aȑ(#b 4:cvHV 2RM68ʐ푰sB cuC_&Jt e:5[R合 *m? j4-lZ'lQLP"݅VLHpr\Z"۸8uaҢ5١űPЗѢZN&aSByǟ*1K#5Fp{$ؼ:YռCfw/@f{ g/pH:<۾?N݊DUFQ@5Ӏ"QEe)Jxi T@XąsJmz3黲lJ /x9jB˦?"huC7Ic/W"`wEsq.OFǫ^:XWtPtowqM9?[]MqIjm>i:G{^zO2iԻV@??!=DzE斁ep+;awk Zўٮ#F1"k^G$m E mX]'̈́ )Tx^^;v]wKx[!1}r贕H ~7޿Fl`L[ɣ\ۓW0"O8yzw˅ku<cV ^5c}>5/B6."Ocٓ|ez_yB W3R4"OcTcհ:惦ʜuOȉJ^QЌil2s(#F '*E yBH @N>S Ӄ^K~}J+V( \mf({aFrY=GXZ &PA2ckر`b2\yCiYD"|X.# qU_ՆKpMsw,DJfkﳛ+ǕTZw#*)h=uz<=o$X(Z2>OB ^EgafUp@ 5:*s/E\j`DwWGk8%& |RM`ol3Fּcm9ȫVoh q5@13ԭ/졇OQ1x 0l͔,?izG(xbD[Qj`_Mά z_PdF~??9ï4s'[Qx^T,n{(M0FS 輒miIu"8BR h9\vDs_v}߿8}2#OX;t9GX?_;yOL຅L9)2(4w( 2/~Rx7&&2`=c.SӚZ%6@4eh19M |ڀ{+ P@I%\IȱjRQʹ 0v^ ʫI5ZDnmۢ#h77V}Mjh\kKYJ=5yJg3d4''1dHg0T [871Q;i`ML̂i`[Gp=!iN= 1μ1K1`!!ɕA"OV\`܀:тQl?<%|Dl,LFP[an-`\"ȁS6Fx4H?xj H( A*yT*O`W>,sHl'Hv"YډdHpԒp?ϣ1 ƣ1;~ \EAE:,]ɸ@30%arE7? u' vg%|`j ץq]Ie()WBtTJ#j4k{+ U&΂!" oA5A=f"*m t`tJ R{B_2a""g4p4)f{pv8؉C' 2>®kģ0~U4 Շf_”E#F9X::  ]>pcs&k7!`DZQIVCɮ2Us/arhC!qGӚOgz8_rC>d t;8[MRҐg8#fOW]Z;" \@ww~% E2훻_TS&V><.z59CLËʓhM~qn+&c^Ч|{]? 
?_^B`m *P'+9iE7E]IpB 5bcQ~D.mٓ&9DO$;R+X#ߋvv%%y563DZ6p?!'<@E%*8B#uA.Dc3,=8|:cf`%1i Ux(2ː}HuFg=,3Zi,3$Pm´DΜ\!_]$)*g%\7S6g$zo Z.$M"K"ht DF+<\)9a4Mz6q?hʌL`sgh$9."ks$1Kǰrfh1m4blц8|(qN(ۍhtWrCX[Tczcb3&"XuDKwL:`wk<'ҫL#'C;f*@L%"bQ\9x4B?䰶2\̞eZlDe*!t2aHI2ethaƉ>،8ź6B 4/'#ԭ HTwdͤZ(PIDV%n*m( AeMfA.j/9micܐl&`O `0 RU;EťbmK f6(hm(,3gSSڠl݃lv^NKC,r>>ןY#[d&$u*bY{3ŝ"SKYb N;hw#fՀ6xj%ٍ/8aK=c/0ԴN3A)Kw_rYv4s>aeڞxyp mZ +}+v,vpLjG{DS>#M, \`Aی.bEhsAaț&\> GL`?<$" yDHH2dH?Zyx.J %$idX2οݾh{7gk'}A!=h;Vig` L՜s͍j=' b;>a+pӂq U[~)ЊckQ־9Q?eqoP0 vMౚ d{F) J~Sv pҡ\[.%(0 iB`9XFeZUBQ@c(-9seˣN"Yt2&0yt8o lx (=X *5=.r*'^AD-sWUWK#ھOimx\8*Ri*ťsi-bV5HL 5Z!,r6ap6%H9| ;Eі\jSQ'}P%MjɎ%r9jt+g"0fZ![*ښ(SHrXE@oaŎ!c:TNBs 8LԠ{K,%e$%NyIyW-٦*3:T|E.wJ4rYX3)"cQqSplKdGNwy=uYU{@ lLJF@$#)5fr6yeII"nk)B$gY :(/sG@Q$>G}T (H9:MbWjIH#}OFXOAV[K2eBI5lZO5jN5a8#~IHJUwI)eig(AL>*\f^'eԑ<:ήN{ڴfZ#Xo ׾* jE[h\mͱa+vgmŚIK.+M{ %U)`zQJD}Q'zA\58 e .X#$Hw{k&ENf!=C+O25Ix)LƔ\$֖;&9apV(=#ET1]` N6u M'uW8jѪcdV]X3UԦ!Nv8>"bw1BaQ$v~A*6Ol<o8oB 5<vvrfv { _yse}: *>Ӯ| g vvz򿗗t4qz.ԠO,?9J^I}-xVbZި=ruyrY5=MrO_}QFt{mL.k#`˵O?_`t8fuDFW> #v}c,ީ|ZKq89,_Ctrv=^Qxv,_`cp<R)Mr*HL +;aǑvz8Sio!RY- o vΈ \%##7Ȑe-YrvcͤieSpdKB|\YPvɕ$"W7ן?Y )i"v%qaE5nv9DP7tewnfs?04|9&ȗ0Yܵ:º 9p ur*ΫZE[ n'$c<ԒsBM_U]ҮP͊2oy-S~vݳ˳> u+vEX֎+7I^}=qovZkx ߱% z-+O*W{qRvű50|HVtl޴|ЏIq) DUV5?@xښdaz%-M@ ӈZڟlz-kZ.GųziZq!^!6o>~DR$o=EAcB܇6CåUts(97Wnp5Cъdf1y7S>`fI:),v͐ͱrfũLLǁ7Zq"q7kōZ/G3:$GՎ)T/9:0e3K~}&L+4F=0ϛI XIgv7Vg׵KfIj hgۼ ng_ VXG. 
*$GK81PA8F* Zqϥ0F"òV!lL:VQNЁuf]˯ uyTbC:;\^*gh) qPl&)B & dȺMvƙc0 GQRisR0~6{INjjHưR |q%Swyۿ[^)5}?/Z{(i?+y*`?+޾ ib״g]$z 19[+}?'~y$oWLgw)R aj{G_.wUm`pv'Ap i$mزW~EɖiI""GM鮮~jۣwM j0hųc+ BYC𼽟`!e\dB*^l< vKҐas F 1̸NV)'¥q&x:S% /@AB 4G5ϼH9j 2?4O7M)M&+jWʍ<@S {Imئ/ئkI@ RrrFGߖb4Ŵ.'4 {>..,t\c<}wq)^ݍG_V̋|ه)\%9Xc&srRcܦ[‘ RKO2Q4L.WR|719`D@~(maXH4.BѾ(>l4g4A pP@KE׋&lI[n*LqPd~GKP)ҲPOr]*^ &Jݪ$uҚ4%ۤU*ކ_UNOl59Q#t4VtiFb 3c1"B52:MRI"a̘„ra"ef9SA K-,mVMB]+ʂ=[VGB{IQm (w@sQRөXYk_zT`䞣_@B?ߏ* s"ۻkf)ٌwOS >w#RpIZjy=_rO kW9Ԃ[O'ion>7$`"E+s5#,$R2oNJif  7&׌iNKKjbc†y7eJ7Dh55ϭ0pf!1c7pS6 74 gYF q+bM8sk5 U֛&S9M~znZU9Noέ?N9'7K-~ot[ :=_t<}VZK5}ԩQ>Nܼ\>6qO[]i"d/%2fRk:G2 UZ_wqېcD3{uZޠûOBP-Ѝ(` 諫杨FAX e;)0Y P8sdӒ**\ xѡ@e`sy"ƈ=zZoOv~K E`0_GIK1{kvM^fjhZ*ш*$^qKжf5ruCt^swXL9# ɖp+ S\٧uQO\[DjQWITX Zl*MĖ\ɰbo$1SY5VBjλrp8B dYBuل2GY`;wqWڻyVf ] 4^T {M9YT  ao>?=M3 6de(hTBm)0!PChy.ֿaCnE~rUaH?%OT/$Bv8M(qH!$8;D[moaN)zִ~\eb޳ԳQ8ЍjlRQ[[ IS+Y8P.p!x !,%7J#gO]O(O!%h,0,86y0{DZ„B)g^ ޸@(n4Jxn"6J|_i珳뫫WͯQ)yƳ!W"&ٗg^\EWԟSP3[MYo=X9ֿ6'AKQ7 kxXp!ZkiY3AebքEIAi;H`'g7Mr5Zrh5PZ)q5y9.@CnhPˇ׏I$W4rK+9KaIM$6dɴ) ()ɔ: th*of4X/A=J:T8Kٻ6ziouv&K[o_]KZDwCoޡ~q3žCK1(p:#*vmPn:(j'[aPl7=$^2޲b5-)xMAubsǿ B$Q|,f QqcÄ6g9sҥV"κj$)D'K,}Q݉x4~l%%7L!3[ %kd0H2jillc{zrJE$(57ǖ&EKf(]&(ZLSscBogD#H|oAr7FIsQ^8:|e>\|E9SJD0u䴴Y LkT!ZiNXӶӰj z7͋Bt>lHhɷMN% TNCܡ|3 ]Kװ16WTksF;9ԧl+g[էRzHgsݓ[@;{0GEy? ߼JZ$lʽ˞!}!֞kbmc2`B[^n݄i˺5?|Y5_^.F[UCCvͻ2خyA~=Q ziS ")@E\*x`}٧9y֓e2\h,̆YlܾUGCI0`~?B `\hzep4GiĭYkSK+K:MUo޽R8uhPL)bߴȨ5BQbs[[Q^Vq!TL:BxPs.A;ZD! 
'犒 U=P?t`sOiAgDS}t ,*-^+kP0T'!uF`RdkB}˶.1OՕ\1ڣ&mQ jsk-1>udv~_HH|?60bKNPʏ74 + 6T|7Y4UnQ& jͲc1X4vP> Tp1H\` a'O4eO4gI˘|РkY1` 2Ql ;$ %P|{ x ]{;]8$0|jgu0:0xao.=4$+JO삵Hq48~|.Ft"w˪_/~)VRd=R6O㻈ɜ.<ߎ!twW(j{?LG x7!xL}>7%z{12/ʡzUڑZ%!.KG$JY}l1m0Aҹun˥ǫl8 N"bT(q.zo1؛§u,eTohOXLn*\fkKy-70h&(+"2ƨRPIMk-R@|b< }eBя,|[R*:`0KR!xl)*49K0Zf\4VOلqr eEqsw%~xv<3HSr^ J1QRZƥ:y 1+롎qW\4NiTĿF?g4..r<_u=/)xniOw&bj#(oٻOwLH4n#EFB9\;n"*/z'}޽}rt>ķF~F>,+Xi/?{1(N9~K!Tsc Gߌ~4듀dc0ǃ۫Rr)!1Ov?|?#jI#ѕݬƭ\mpkd8T7pk`Q [}zcy5.g9X3,/m4:KsՑRcʤFJ&Rga^jNi82X&&e_yt^DC,u)0 D#,_F菗TJ_ K9#s㤆 ;X A*J#U!Q+s/XJ<[|3C܃!9E<};4cAМ hclCo]R ՄJӁ}5PIzTBILP.= f4U5Ps|oN9c]6"8NپB{"Y_)>^6-JyI^1?5~˂5qmgz<"$LC_B–{[@A꾝?C8^ؚxu%O3bGs50l K^\)9xO3"rndgdIOŝ~R[8ע$Tȹ3.8VJnWZ?;"*han[`QRKXR2c>cN?3@4r4Dkc @޴ Vε^A;tr?*>fw!42nHWތ;2n}shWs??w[k0M]GlX!Mo%(/vvih c2Ow/*5\9޻z7-O|b?l-*tϴjkG:6V6_}p-ncFPiWRCiV>~p=xڪCbV<!.6 qt"m8;j)A]eRt mn G-GPIei<[8mp$|1"HBzvFx@yμ1Vv_El5 ܦ7ЊQC ^v=_s=2\Ietׁ5P-L-`r5PH}9?XJKMlMMFÞsA+J=XKX\q~ heNЩ8N.9@ NGJ5f00Z6bx;4:^SMe" !%3G,֡0OQp|T-nq,ѩxpYpR`bFx+˨ijA@ۜ]i Z#kzBPYK1x8GRE j` )@Jz|Y6*om7lj"5g{j8%m6 Z_Z",QQDM TARJJ341MLing \6nͭ`,gJjB#TV  L X֭cD8ڵy@wV\vlAl`=WFtXlSoe6[z+W.[r2bI@^*)&݂k 7 *¨&k>7Y[j0.w~QxWc7øQ!īf,r[x8["| RhzOޭʤxeeݿ#{<M,~Y~Ys ٣HKkcWw/૸y)( mķ?Sq:^R QɌaƳaq2EA(Gpv @ԥ?s^5SPLk4QKU L ZxVa{tq } q+pvttl @(Ym d|pAR3!)GnsRKZo5l+7eu)kv?](xr? ,.gޤ8\]L|xB\ wՖ,+7&6Ueuf*/;ݚbc:MQǻoiB*/3n͉2[hznMjYRݚbc:MQǻou*nͷdz6,+7mN9`< 9buaT-" EL]-ZW V+? 
}iZ㸾"-(]* =<{8c0e1,ܣF+R +JUPv:Opu@^=ZdZ?*+ʁ׌qj%i9鄍z9\0k^geg;Y؊@w;_wJ#t_7mU*yvO`$w.C*9*OYiˤ ι\n-rCb8v^ %gyFys;hR9 pm<\x63F.__rCywD<ɔ嬘/u|;_$h.w3`dp]@F3BtGs48 bZFwu>Τep3ZXKr#?S&X\u5Cլs -bEhQCH9LTd>7]{pmnb.hWXb)3].j\\1yʛYyӣ(π&g[\B+ᾇWfLRgZjK֜pp(:?A:G79ڢU$::HwpHUݣ]1ǩƇ}c.WEk*]REn\o Zc]:?8.]kyVGnJ +J˵.Sb#⍛6TZ;sC8QҕiJ:( 6HѣT,%CezGյ JUul+˓1 &B֛/s3SPNXNl9m-Q݀q%Q@!ܳG_RHeޣZXoaſ >WɏL'.j6n$f'&tj/ih5/2 IPm-ǔZ|1bhNDvHF>zUɚ%Ջ!~3_ B+:X4%]Grb:5&D)R~rpASX匣v^#0XTFn}D[h]&Dޖ.O$(4e?z{ք*Bz7fMlɴ8U4U7)y@ݱ#b~p*JQ7V8_jV#-yT0fҦn[oʕh[5DEly*/Yf1^ @ e ^']gs +)ZcX%5m+t;Ubė Et.ǽZ-qg$ tLcZ~5IKJ&^%0)p# )FFGdtAGƕFmG #*W ~+' QLCr'ׂǓ5",yh.n{mD|HNKx΅[p,3R;U(7=yr=3 pZiquyM+\û/ZKǝOQ*?l}& u~'t%wfIW4`x*7(.u \H؟l[ [xglz|&w2R9 `]z~qUݲg r4:.q%Ww(' Mmlgej}w5/Tw!Vn_=|#4vkս3p;Oqv9*!k6wr˥3r OqYMA)3;lOъvedҙbTv!Z[JsSB7dhfh6F$8eb[Lns&Rn[n*$ (xp乲&GRFNwzrFWסf.~.[4_hɋ݁#5ӯ 5QPaTiTo5xSU;U8$*z-I$bօ˔ƅ;mH)%c\$اc$ӑ,Pemb",fz"`uR!\^#lӇE,.x_>ۄjV'1nx!WԶTnqJj+;+8SBP*M7S@"P/2DFmKDPFʝ"j^B7Sk$5rSd #J%䖗;ƃN|w񧋿\.hS9x*s$RdL⇋?K{$$^I9NzM׳O^By&/cp =lwV.nkcJPx_^h*G/O>kk?G35|K>].nn?ŲL%SrJycgYA_/hb*/+c# {CR [܌'kUAP7+ BH[[}tz;1\ !Z [%7 V@Ut}te+ẵjUJݪe.)Tnأ(؞v[WPN9Wokry{v2*WTqc٠:QղAqAuAiϲA񫋲A5xwiI/ 6@A\ѹja%m\n{MUA@\+Y)kxV0r*RjsP4<)5R6tf t]gзSp(osˠlZt GÄAyruD<z_.sjZz I\a EJ(OI+^QŌbQ&K0nT޷O M6i+%SґtQp8Zd\z!J$pQ|('R)(L'ɥz*v8-Z f 1?N;=S:s gH"4:N(L PCI˝ZLwHod3Æ/ e5B"@q@4: \44+>4g8'a1Ec$CaZY9.LEq$3V\Y>< `t\$Gd j'hC~wҷt#p7@]^nƉR80DÀҌ,/ŒF| ge7rG7Y!( +A4^'^t]t.<20.4, Qjb #i6 )3=D. }(R.*.fR~sK~,Q]]0~b$rz@^|#Z֛[w9)+xڣɽ)yLWRqhƦWPʴ_ֿEB-{O5Ÿ+bq_l1ͮ/+?,l3ڀWލs14ҳqTP))`+Mұ*7+ti~Xqֹ_wk 3#?ofTz{"ާ _8QճZ+gDf ں3``D}y Kfރ'=*@`J ScFlj2gft=Rpώ /gA8?9fggSw|YvϪlg9W>5*whcT |;0x?Wz]Th Vx-ް#F` د4<1#L { F]RսPf# ,.^xN@jfz^Nc6C҇[w{Au>T, %P'G\DF9T9#Z/ r*Jz]S͌aEh9-"=ekEs;)(63|6ן? 5 8g|sQ.FruԭǏ iYHzkE7. 
c "MN;&BpnNNeCs64 c)"p/o'ޱN]%گ.:| >K=kuzD^~=N"X+2zՂquzI K#.y[OcHQzg6gֆR7 .h7%&c23$އ  c2VF;p #>dJ, wۣ`l1*qMy{~7MhVK-( sٸp=2xO#l@IS(hŘQTD.|t2ö5GAG F!VKAFk#2⌫(YhToV~Z(*AY10ۜ$YU>DoMe9,Zl0FN@Y-jd<2#0OvLNN÷SLTic zBq&xHj؟oT?t'd݌WْTs0=X1rЈ?\[=:[TZi*yKJR>:JF)1AWL q03"jn[#Ơ+j́ LHj -hGZ玔yc)sБk$454|r):@ڭr ՝rO;Bb(e$';_[%bݩ*sosR犫o*.߯jk݇:O [KTw& DC,1e#Ci'$gy޽A#0$SYa ϷqQCX7>apc 1x%ݵovw^κ8xEF_%-8*0k^LIi·(]qL 7x+a'mOHespg`Om!EYh0Ia"bT׾eD;8Ty!'rU$91b޾ԤÀQ-kx˂YϠ EO؛zM=ÛҘ,Ӑ)N{/WOeppHﱎ(={?9GFUڲP1<(mȩ#:p |e  ptLD3NVyL-ZzN[ЈU͌tmݿ=YŶh'>^%Y-"\贤FsөrB+A(4H"(#G4@I44vwK9^@5A\p1V\KvQ`Df"FeF Ǵ :1N"=!%hJUlV1X\<1JtJ%-yl) ޏ"$.+14P4G]P7$7݅L$ec?N&E&<(UGM<`7y:H:r 4~$JwPꝐjPZK>u[U|~Z _%U4 J"V U=POK~N[z2ߤ߿&8gT{d;g7dYd ҃),Vҳ}+WmO""{ڍJJ1gngZڭEs[2Eg(OCT VJ9C@2F!#xHTVhe"Þ;m?Y- Wة*Y)]X@0ҽc7l86ƢhD*+**8`"<,Ig.(Jï@-<ܮ%;//K/%Q^(`H;, @G{$VL_܇Ռ)-u(R#z8E6MMw6N31{ ,2@nR(I"#GK6-͹CRl(R'Rk=vhܻ۲k,NmC[g$Ӕ%S֠)}~}OӫvcRGб$0[bg0^/ X5-hFb^{3_5Ŀy,>K5wW twyn<Re(ufqO'[k"ެytVO j.? i5lUTA'UexOiC:,\uo H^uO|8_TɻzS|@C1 -'il*b 0il)ZԿ<-tq,1eȕ}m,߹l]X.n*y" E:>?-8D"|JOݺttal*k{I#gw{xm PKC[`_`˂(X_i$4H"Aߔp]BYO4'Bjm [gkK&TL$C؂l=G7?D`+L0)dLҠp|}{{oX0ja0p0C%Pˍ=CQ{,jޢ2YpR `w#zbP} B0#|} 6S$1JwrYs;-tmyѷTSPS8PANx3n8_ ܖ&ODxݯ4ba1.jE+U(PMyσ-j/anLݝ鰋/cF:3ھvZq*p1sn`8;^qs;ZQS/|–[OFYZ@w8b*Up,W}.y +hU(61Ʃ`.uW_LHxzIV9gX(WGJY/4ukۯ;1߿櫑 <5T=ULJo㧔]{~|X9SM*5⎄UFGtaOʹxҐ͑DNjvpY ;xc< ح#r!wYlhxE1`6ZX/ϳ_/1y1. }xwASA )A6}̱Dr1b݌vPէ|:GH0> bL$Ww>ͳ˫~xJ:5g쭼*^j(vw_ER!~u>ͤ_ Lξ(2 3zcQwd&qrS]9bʙA7\3杍FRE(1rLa細@pduztw(q_f*FG1b&Tyo9ܣ1. 
cQ9,CjϹCS{|jF!ncE5D;.iTX#R=V98=i,M؝frh-M@ o;MwFMTZ9Sq۩㡛r iJϲ՞XloX:PR TESN9,Dmk@;MfT*'{b{nN+:BNG:Nhy91VY10fk1VSu 34V.S؄ s^ecFt scI +e'f&[KkTAg^i8%Taw)c1j>` :k=K~VjUJmTaF @ 8~pm{[9lIYo{P2)|/ ЮTYY_~eyprV2N@sv pS43 3⸧ím:CYB"2p:eB([)uoRІI]YSXeeWzS>dRY?S`qq1X^#bz*w<]kTE J myW6ށ3ιq:ArF&5II'Flq)d6Թ*DcYhx%3BP5Mf0ȑ 9ߵnU`OYmE?"Z҂qBFU.J\}B*,^ Awkf-jjӲe.=]m$o)BUvQ9<~z ;=xZrC!vX6GX&G(e|qVu~`cwoy|&CɹA>mtCyH7ՓDZɇ^u Ϙ0 So^rʀb)IcC CÖ9ciaY7M [>ؘ+֎=DfO _8 *rB(UZ^{xjk:b+XIcwc EQh R(X H)ĎvLVeW3D4c3ۮ4O,mBIbaÞdÞ[a&kmŠl^oz!Y,^mC;,ߗ%rRKб$wC(#|q7RE(;s#N%+8 crs|p*U:''=l TtWƟ}YxEK>0}ʧui->tO;|ҵUKD;Ѭ0%uNޜ$얋AJ>vuG- qv+!h]?i[y׆84/f89,l sa2ɴ}d)-l%≝׀͢h mY0mbw~ P'_~yc" FTI29w׎[`TNq@!YcUPCt+ENߝHLkӡw7=bIk0duݥTdh?6s{o~ro6TJPgswk3 T"^_`L%@#1"u#&ZbS`4yR`[D xއd;Q[oVNQ:{jڱ*m$l8L0NLw UX4 Ͷ7bU'UKݳw X ܲGQ':f͸jzR8Y{5eےjyG ZK!I@Bk }_ 189;Wെ%eM@_jCᲅ;0e?:Q$QFxi'\2^,`?-M}.(%c,RVr rD5a+T׋{{Y *KJCZ6ʀo@cCƃ#hUNbv?ʐ`[N[!g0o<*HX0] H|R(ۯfq*r x'O4F84y J hzJ%*4`A(BDU mdocK!C; y$V#Y M ooEV {߅, &c# 6ND#٘!= CVEɋp28Qk(f 3[/;5ۗa mÄV Xo(w&w2^И]RJ31q"l)RGP Fڠ:0z!-8{l$f T*h,eFIQb*/y Oa LJMU`^LeT9l$v0q=1M-";'zO= %EZV( C$P p)tٛH5!u4Mjj#, ^M))fKGʯ:i_ߐ)(^BzM;1jɀ:$qZNW7J\%\#)6'FZQb[Q}Q5]+Ll6>^%bS{B./Hy!]ry.$[pZNOG[DJ:emNG|>^\B(hU?~,.oou1:r ^9jPOҽ$ Ѵ ŷ|7|xTgˏ>k`x'/X$d^2}Fv>~ om^/)?Eiyo VRR__}'L_],;4aQ=܊Ʈaz{ $N,|zXcڋH(ncט[٪Wcr/nkD28#8%o@>eVjEtd#NEnx}7sWL0";L3EtNޭ+T}+4Msk9F캀vb LjJK_*r+z5-5RK|2*nY yDMR)unPԉNzJs[) M|.H&^XC [ƒk3E1 ' a#E\">߯H)-7nit^ gB!i%ݼzb`ލ:=(?-BGmu"ش|Ʀ՚(.5!yږE(,,}kX^Xm%.GqT/ |كDI^:^9‰Jo-3rڊ(NYM1y]Z{c#wL_߻ޥ4ЉoBZbQ Y&)qƝvG -'#a1NBu#%Jd#}*& bw6-)>@4ŏ 㳹.fI~Rvo=f1)71ز#|5}CgrF5ydLĹ5Kէb=܃1౹g7wO-&{4:%w87e!{Ě̒;nF)%R<⇶;5ubڴqs,1͉EXpd-tRD)Gy¡\}j*A#[PjZmJ:rƔFG^gt]58;t[z0#ЊZꞵuߞmu雫zNGbL9f㋃5  K2-L-QQ d-6nsD=xt"c^TCș^ˆS g"KL._LdGq&rR>^Π;̘h?,\ #B۟ČBQccV,M$`a0BSS43 >\nY#ʁUWӓ0;D㟍Y q-B!3sN#A;ev,ws\&v^$.ϱv4`ΰB $YK8R4zCF2)KH`Z`Pځv2eRj fBsb|XG:)"=` 0āp灃Nqlss2tz#A@of ,Gnj'|\9䱥-|~x}#% C+LYT/(S F8r:i"R[AuqB &8c68 3ӆCP7WxfXje-T.ap!s+᳕O?>+ڿէr%fN7:y޶%~2{kr'=vgǫzЦW?]?L 7{8z;rogmmkwiogz>LTȧ^lժɵ#kfG2lcϓ8f1{Iy,'?{hwyV(hˉ)9!; q ̔r 
/:li{gf=j)7}VSVq%&KLy!SIZ`qE.1#J('L͚obRQ I.lP+\sYiN,8&Vcv|·6v&>@牎 ]|f/crȆRM J5F{8C%=ƚl%/ΰZknnD  Q%"`0;m@T GAw?Y>Rdp(/j9l-u\;&O7!!12fLuLV)<-bt|~qs0HUo1BHE.)o)u AEM {cC!$d{:Vy$lP/+ɑINʹrށ!J55V~&&K*;\0fE Ѹ|"}"Dzr) :xl2o,lPo!|5K2 j \!N@i>My9)?>e/mf%0I?H{3F 9e#щMBxk|*;%Unil /'[ؼ8!#܁(ǣ w ߦz 96m,?2dYS5-YQM>bgwR Vw$1vgZOZ*BXX`,PMƌPQp60+ CG<-9W#|_nX $碘>Z8?<19#D97!AQ*] Rgb׹݋.9Vc)@q~tSm٨7+}Y-ZaYHpٽ|@h9dA#gF㞗=Dѫ,VUb4p{mרjRcP֢%[#'{^:E&! r[KXVFm$uW&&Չ_SՅUK:&jA:rh̨>5)>đc65đ [U:*RKYqU7JX'lΆ&̲d*c0:ICqRwAʒ [VN {}7s:;h8K4urLmx[ש{}\O)CI|˧gC-|=9YjVyzhvq}"ZMٟI3߾ǥ㫋sUmm2 ;wKn%Q(FT^ݫ?C\-L 9f]>׏߹㎗"Ɩkz y5FEo>YgЮQ{:v 'mW>0]i(t!1:{4PA;L]aof'r ׬ƚ5|*._Ƭak'(8 T:}7*8J@LHjt#.59$x*h4"4I)rpJhM2L-SI3IА:Â6 GnOm~ipjY9ܗDD:;?y!©Yf-gv 'd%}tOIB2Es)D wtQ]hkjt2qCBLFٷ#&sQuu@lywm]>X2ч(?O [S԰:5ln6l]!)wQwM5!8nFxN%@KI:x>} Q *֙NTd!N:!n!@g*L@π@; s$֚ݫB_X.( t29vwO9Ԝ7A͟<*.iN: 2nCpeUFJ) >ja 2jG-|ɋTJEN[Ƈ Òe& 8ϨEM%8RG]7\U-p+ ъ{bjij Y*o_6 ` U:Zr}T]DxEsѱ`$\ $jaZ9k.V,h^<)pbhƕA%r̜Wl/~;հW!#K637>3r~{| Ggrw9?Oqɶ> b֫cIhY7s1es:{üreq*kxz7w_z8tAFG$*Pv0n>/\xkwĸLEoڵmNH_=2 ɦ;i-? 
冀|*%o<*?0 ־s l%'9S @?ə\x (y2l£7nh(DBTʣ%@#kgX!"׈N]P,ւx#?6`[s;hլYj؉&5}"BtBA(3T&,2Gqά:z$^ڛ@Z3%쓒 }$CaT+NLz~}?+{Ct 缲5PmzU!@QUu僌U^=Xќ4x>mi_Uȓj,Wӻ>eNC`: Ts,~c,~ ]<"˷SdػhZme2vTM TզY7[Kdҽ]w,w' C4<{4:<4mЬG}t5ŌsV>Q[rO]dڊ#]oۏ;h5zJJӃr7]U(]Og^sdw2CrJ:!Oa  {i{E4K Arp<[Ot\2Z]A??.6.3 n huQCl_bAiXhk"F2c\.jmS Rn }>lí8VݗLТ0Y7LMԳ78A>YYܳQ_Ûøh+eo1zH ARFj]_]B8ez .Q"9U_n/6=ه7{Z %W7 Nt{ 7v( q]f% wE6r 8D Cu;jF#!T_xѡG(y(g^gø-_`?RGB\E!-ÉvT!R2N7{ y⮭%ݳ&@mζe bX%+I[Iz)]c9^' 1(R2܇{tڰs1)jFۛnّh)`>&D**n[I';mk&gN w$s0<dg+FZbd0Y"I)\DnJLT/]wsΫaysusn3yº_BC:o U*Myѝb=.G1dGuɸ&%=KC Jt%U=Ќf KzP|&ku"e>,BED,nܙ/Zy=v`@zȺtiT*zTc΢uSq?dwZvp*$.]}\rPgz")dӢV2 !][ƬnߡE:IӄPhũ aJ B*svȄ7C>[RK?K/qʫKdi+9Klu95 C>[<ɻBYK=V=VKȧ5@3irCeCj9#ʇ+$16'2eI0+;Gy-~bp$9sP.%!8NY:ґ$yq6Ly/^'er:G9 GGw~xO M4`<φ# b^Ӡ40P,xNt'a3-;$%aݳ$FR7xEr;<[n4Lf-Sw} ҵ@ AWqQ&1p ` -fjƠb(<#^z/ZTHph1&P]%H]Eg;ё(k\~C켪$Āʾԑ|޼vubS a00I1D#_)ue DLKT$ixO2 +B%@!0)\{Ü #,:pNiY]ͪQc.DiuRr,w*KbNd xVC l"E$Z/XP%bBDհ$͐h+wwJL|XadH y-뀀{+ 2gZ*Yj&sEk2}@,e>(ZT{!` x_"banp cR`^hcCa&m^HDڒĹ"xQ{@T^{ĴM:8x;1K]St+ʙR`="=21 PW焁@WY*lޒ)io>S2 V)jKЊ1J^~^K;/u$tQ#ŘƔoǘ3uߵ7K.pq8 oAz~D`=^h9nS]X!=ͦamrOӻon%|B?Fە_UP6\)Z] {O/ O\\W^y{3rqc'IZq3@ȉ<N!;O~+6,x]`lp@(5dLFog_G}~f2E\0#1걺uʬ)>ҪFbTڸ;7>od@}oG4Hd u pUDx5BZ~5Wdﳐèa\]eS<@ f\GlmrwQ{'2RrULGhrnݻ2hhDzL8/4C fԴ ֞p.|l|TT!O8bڵ1ﭹ< ;'$=slͧhζnݻ?Ά9B3s0 ׫Q﫫MxGV?n|e֏ϳjf(Ivre׸-7+Uąj(7/@k.Rrδƒک#|pcIzz)Is:@(> Zm_ۈ~OjwNpćQK|/6b+JSN]ɐR3~[p."w)c%?:ØMHS\niSo| „8{Ni6? aOe%S{9hEuo'gz=@ ֬&&[!j[L8 )#^o:QH(EzAnw׭{j1AVLR{\ݲ65#p0[/auK~F#v]Hp! 
}LWM~ bdPT'BQʏׯV4eJ!gJzѝp[5rǵFq71qKZ*}!7U95fIʐɶǘ=zunݻmR!QSh[_rVĽRX˳NmZ  IzZ8quڻ:7:HrzzzH 9hzF%k+s6=" SO]Qax{ǤHqw4ګ우˔!X)Z+p$ב !kދߩX`nP6zH2})zBc^FrϩjR3F`a}*r-x)taIRHNjb~Fl}CS*d%X5肪T'?T&+cNTa$Q\3r r9ey&- vH25D"Tx^JG3tf.e< Bs".Mhp]-pi0Ng=I[iPXÄ7`*m̅ɍWUT;N㶖чr7+q% E3 F.ܜa(G&LX'{R35 AKQ)9l !ݔ}>Va|NTZzЁK:?z5|bϰ蟹6Ԋ2_1~_9I]|exe0j0|x~^a>axv1'wM~v,+7$*%wCnĘN;xqSEznnmXWn)6n|m-I}GvOiikXֆ|&dSbC&(wK tRQǻsH{n[ޭ MMPG fwW;w*ܭ>t",d\zE.5?HVPk3h!͍m66`3 㥷z!p,8J?g76xq.s\z-t/US,ig;dh E:pubU֮-=v]D}C\o 6%bGon[lnEmnF5>v+pn>8&`-mi~]W)>N]GMvۢwݾ &8>mBvTQ{SSLbuGi_EpjZr:+zLלD_u_h{ ʹ_J- "KM rSMn=@k(CI:8yuMCTg>Rw pĩ\u~thlU!CӕX`=,>n=m3(mPqՙ H&}l;wF[uTi7x 4`W{Vݿ4Gr Q\7ĒюԞ#;b}uW29ػ \cXԻU6 Ym.'!nc@E`T/cuu-3չ0Rgb88H>Fָwo߿Q\N?@:.EmuuJgEX~5tLDM/JRzHJ MiyfM%uP]T?uŕީDuCk푦1^.i{5o,,M%҅tBE}-|ޜ&Esf!TETF CUV3+*":}QJ3f3EC*m;b2JjC +\A';ɽ8VIEX)j/WZ(^Ir߆B+eLՐvo׷lV'p$m`43ql--=iaDECU>Dy9)!JrR:Ziʗ򙷛#w*A^~w?ns'%nЌ,cl``k#ke+]y\q=\o>WFºGkatHkC~0Jڕw 8! 3 C{01h~%]r&.2zR_rɎ˝,/,⫏~?%)kE0DP)*u)k#5- w3FÐ4w5cWW/>eyyӨ{/6jy:R(21M%J_IZT 5*._ PEy[Kr$Rr2BcI+;-m#[…v4#38<6fX:h|j` HJ)KKK@KjQpv4ir!s#Vdƌ3.ˈhm3XI@Kco)S6~6CaQCؙ11N8H$4#ź-.7>ߓ;q qum=yNLk>[Gfp=`HaQ`(o~lc|3b{OV'yE>A}RK+\CƉRӑw|QJ'C68]I,=ZrSk!ņI؂b䃉ʽ|_+LRVsգi$Y#PZ+5y䭻g>ۂf@BxY/xRf2-Xp<i˘ZPQ.AE@vD8%bl_v F#&hke*a40TYeF++KT @J :`1VI2!EWI $)m$ÎZrHCfU,dt4ȸ WHV, zM0T &$`(|PL$Ne2 SL 33{L 6Hhpn/#lkbٻ6rdW,;y/2If1ٓ`332Fݝxב}$9%%٦n/ȭfbU${\D6LBxS~UMKN]MMLVQJjεX-s}D%F)0޶F>UsJZw4(DAih7x:'ڞ]z|֦ ~/epͿ{{yUD8FF ]~ţ :v 3 SSIú{z=5޵$&(tk8OBza±Ru+_ ow+`Gt*u1zPI*izP)o,&*~^,r:Nax]n*#^=zvN\ծD_ ^[OXuO\Id!'n6{ GgУw tbۨKf;z9n[ 9q)o~)}&1w tbۨq:p ozr&eST >n)6GAĶQǻ:Йw? RXȉhs?s:e] NE0QD qIs*"ecQ2G`ޕ84?\Ch.-~bq m ^\v۠@% O> vëwA?Shg:FBn٭a" Y=ԁnPدlgwtߘW- [ FFl,1yaQ$1g8%Wok$'R?/o Ar AՒ{FWs&?gt6h+*=^&)C*kIX(!b+AKr.3 E2CdDk,Piy8e VcfH g%)Bzu:$ho> py.[~_|7b􅳋KǍ;OACf3J8* LpTaSh~%#B!!g&~ޣ2gTmLNp@IYpUa2X$YɄ1*ᮬ."dci0)Sw"XH";GQ;5A(SÔ;{#6~Tz+X[RNXFFV`!0(рߚ'Rh3n1u219Pp* gIDtyblyD$ 3V6Ĵ;z5"2,P=Ed=m)i+9m-mhtGAz\b$"Q:b^ %Ne) =hHtP'Pkz'`9$%N-Q!<46Riihz&9zTtPG=m9[E>tg$[Y[}'YVAu|\k,DzjZ(J1C5!Pg! 
1(GXvmGA}{6zPqyGusX hXj@B +x0ƖKmv+s2ݕv2)q` ]iƮW_ II/RӔD}k*Cs}0!m-إV/i&}@_&nsP )Dg^ehr/[)#<ֲ~zQ13$~Y U QbAETluXKRY9whYRXȉhM{ͮ[,!F_2TtǯRXȉhMD{7Fb116xvNm{s-Fٔ5ԭ}&-1Zk޶{N ݢ.XȉhM ql$&v={ќs4)G!4+5p{O"'>s͑,RIc#AzEXqzqBʑ@whJ1st$Pʏ/N܋q9NH1'P9N8 IGv|qsr$0<:8OEㄓX0&N`9N8 IG9 8!H˶'p"9N8 G^ ]"W2R̯ߝooW\#Ml%(4Rb"NzD> yي i;Bqs֡m7,f凉4şoP|= :.:4΃ 4,vns>"z"WRK.yFC]P?uկ7K. fز5qkk.kkBZ DŪȶCTlUKV'Fn؆Xp !>vAv|d"pD2:,m6Ř%b/U c1c,~xs}͖ӆo_+4Z5O7צxDeQ./db++++_u7ءw;AJd .! vUZjF #YHfZseDaUϫ[w`u`(WM} )gڜgX~*v \э76eSQbQW$ ub%XhBe99kh-r)C)Ls@Ț*`c$T3If֊2y]%`Ì.!z(lv%fN]MI' g;$eADR "dTv2U@axB+ S({DSWb5a"snt;6l$[]t9/nŦ [8/6mw¬=`*eV"\Q YAB;3 5@+D GR/itCvCL]2 x>WU<~ )Vx!as ְ So)^&~pgHӗ"ލyqu.JbM";o!f40XwяnO OGo?^֟Vuth^d7 eJJaUVt 9u>VTI4gxsEV0 KO\uLJ{jcmq!/֧],ngeۛΛ:ocuѯq(H) 6jULƤuuk.!3!E2 $֒J `e w3Gp[1'vF9L\ 9`|޿:/B1ԣd;i@Gb>ׂ#bcxvB>).wgn&$*)yJzIظX|;jHE 2 Bs#o1+*V@W+*vە++*L#8b.dFS%A$WgI)s;eQir!6uE%,$œl\*0ek 5WO y^"90`(:cIY@ܙ wq !n279dgN|eVHS` R2"G,SBص Θ,%G&޹lfݷ.V_ޤb1ޔ7 kHK)׿\fzf~Z3j^o踽wzQ̿llWc+}w߽$+y""}b4?)+|g?j/)?[>_,7(cSBVd{t߹ݠ>ͪP ӗG/SJH5^OcK)#_[VCV㞛bꮒ~.Y9.2Obnv8|sqCaS5ȫg|ZPb+z-Vz}|}̮l$>5eо/L% SK!8CnU8]ƣ`$f@dB{ M MX$AENm5ev`p$OM!LrŵDjui}Թ!K8t+ߩG&o9  m v'ru!Y<'Q9>#"PFfDaF[%nq@c *U2luvIFB8Kn|-q8,g!%]n \vIhVXIލw AKySZ\JrzIYevi0LfZ:J|veA/n i]TA1KX8ym| 2HGP *WXuw5zfТ|2,MJTN24`i6( c*BgZnZ"+:X=63پ9 gpRJ LɮP mڐ8/Gjc2p `| {'АRx Tw9YI%O[~>ġr`ʜ )HR,\QO.FTŵVi)'lH7ɹٻqdW ,Q<,{3{1=^IuIOS[儲$[NiL.VWbUXQBA7&gprdY=%I{hFs臋i'8YKVFQmKzT Dk6# ZHN['`yJfH 4i'ʹ–S0bmb5bg;ebqDqc;N!gC +bWHe)ڸO!Fk޵CԐU@W(䕨jc|MgFTZe JL8&Sd;{V >O5&VY=),5eI(0!&RI| ˯k1c?{ү ͑숈ҟWJWGt3'/λ* pc{7$NCvZ '#3ڼsӬw_l;?j?R &JxʝJD=3ޠJ.~/[Z)j9!DQDkZ'9ّԩݸwF9C>rAl9&NH-aRƒ $4Z1ƹ};-e5Q6;;!iU:"tκvbÁJtF_R,C"53]^ Ҭ¸0~A4%rtgy!zgOB邩F]NaSs X2g#Ps$z / ЙMo&ʤ$wQ㜌" B ^kTo>Ecm-‹ه^u$.ŽPYݡy^]SuK[BFGPq%B/ۨ`ɶKu>M< cX7jTpXb0u>d>'se|k. 
Mar 20 13:30:19 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:30:19 crc restorecon[4698]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 
13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 
13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc 
restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:19 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc 
restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 13:30:20 crc restorecon[4698]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:30:20 crc restorecon[4698]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:30:20 crc kubenswrapper[4755]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.923180 4755 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934051 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934110 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934121 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934131 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934141 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934150 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934183 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934194 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934202 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934210 4755 
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934218 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934225 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934237 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934248 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934257 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934268 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934280 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934293 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934305 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934317 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934327 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934335 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934344 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934352 
4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934360 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934367 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934375 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934383 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934391 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934399 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934410 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934419 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934428 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934438 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934446 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934454 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934461 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934469 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934477 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934502 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934510 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934518 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934527 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934535 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934543 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934551 4755 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934560 4755 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934567 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934575 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934584 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934591 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934598 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934608 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934618 4755 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934627 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934634 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934642 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934685 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934693 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934701 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934708 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934716 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934723 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934731 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934738 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934746 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934754 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934763 4755 feature_gate.go:330] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934770 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934783 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.934797 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935778 4755 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935806 4755 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935822 4755 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935835 4755 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935848 4755 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935858 4755 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935871 4755 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935883 4755 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935893 4755 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935903 4755 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935913 4755 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935923 4755 flags.go:64] 
FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935933 4755 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935942 4755 flags.go:64] FLAG: --cgroup-root="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935952 4755 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935961 4755 flags.go:64] FLAG: --client-ca-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935970 4755 flags.go:64] FLAG: --cloud-config="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935979 4755 flags.go:64] FLAG: --cloud-provider="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.935988 4755 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936003 4755 flags.go:64] FLAG: --cluster-domain="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936012 4755 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936022 4755 flags.go:64] FLAG: --config-dir="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936033 4755 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936044 4755 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936056 4755 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936066 4755 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936076 4755 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936085 4755 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936095 4755 flags.go:64] FLAG: 
--contention-profiling="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936104 4755 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936114 4755 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936125 4755 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936136 4755 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936148 4755 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936157 4755 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936166 4755 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936176 4755 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936185 4755 flags.go:64] FLAG: --enable-server="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936195 4755 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936945 4755 flags.go:64] FLAG: --event-burst="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936957 4755 flags.go:64] FLAG: --event-qps="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936968 4755 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936978 4755 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.936988 4755 flags.go:64] FLAG: --eviction-hard="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937001 4755 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937011 4755 flags.go:64] FLAG: 
--eviction-minimum-reclaim="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937021 4755 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937031 4755 flags.go:64] FLAG: --eviction-soft="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937040 4755 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937050 4755 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937059 4755 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937069 4755 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937079 4755 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937088 4755 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937097 4755 flags.go:64] FLAG: --feature-gates="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937109 4755 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937118 4755 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937129 4755 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937138 4755 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937148 4755 flags.go:64] FLAG: --healthz-port="10248" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937157 4755 flags.go:64] FLAG: --help="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937167 4755 flags.go:64] FLAG: --hostname-override="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937177 4755 flags.go:64] FLAG: 
--housekeeping-interval="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937188 4755 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937198 4755 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937208 4755 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937217 4755 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937226 4755 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937238 4755 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937250 4755 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937261 4755 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937283 4755 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937304 4755 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937317 4755 flags.go:64] FLAG: --kube-reserved="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937329 4755 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937340 4755 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937352 4755 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937365 4755 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937376 4755 flags.go:64] FLAG: --lock-file="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937386 4755 
flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937398 4755 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937410 4755 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937428 4755 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937440 4755 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937451 4755 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937463 4755 flags.go:64] FLAG: --logging-format="text" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937475 4755 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937488 4755 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937499 4755 flags.go:64] FLAG: --manifest-url="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937510 4755 flags.go:64] FLAG: --manifest-url-header="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937523 4755 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937533 4755 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937545 4755 flags.go:64] FLAG: --max-pods="110" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937555 4755 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937565 4755 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937574 4755 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 
13:30:20.937596 4755 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937606 4755 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937616 4755 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937625 4755 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937684 4755 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937697 4755 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937710 4755 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937721 4755 flags.go:64] FLAG: --pod-cidr="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937737 4755 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937757 4755 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937768 4755 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937779 4755 flags.go:64] FLAG: --pods-per-core="0" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937788 4755 flags.go:64] FLAG: --port="10250" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937798 4755 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937807 4755 flags.go:64] FLAG: --provider-id="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937815 4755 flags.go:64] FLAG: --qos-reserved="" Mar 20 13:30:20 crc 
kubenswrapper[4755]: I0320 13:30:20.937825 4755 flags.go:64] FLAG: --read-only-port="10255" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937835 4755 flags.go:64] FLAG: --register-node="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937844 4755 flags.go:64] FLAG: --register-schedulable="true" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937853 4755 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937869 4755 flags.go:64] FLAG: --registry-burst="10" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937878 4755 flags.go:64] FLAG: --registry-qps="5" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937887 4755 flags.go:64] FLAG: --reserved-cpus="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937896 4755 flags.go:64] FLAG: --reserved-memory="" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937908 4755 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937918 4755 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937928 4755 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937938 4755 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937949 4755 flags.go:64] FLAG: --runonce="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937958 4755 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937970 4755 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937980 4755 flags.go:64] FLAG: --seccomp-default="false" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.937989 4755 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 13:30:20 crc 
kubenswrapper[4755]: I0320 13:30:20.938001 4755 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938011 4755 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938021 4755 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938030 4755 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938039 4755 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938049 4755 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938058 4755 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938067 4755 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938076 4755 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938086 4755 flags.go:64] FLAG: --system-cgroups=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938095 4755 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938111 4755 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938121 4755 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938132 4755 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938161 4755 flags.go:64] FLAG: --tls-min-version=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938175 4755 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938188 4755 flags.go:64] FLAG:
--topology-manager-policy="none"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938199 4755 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938210 4755 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938222 4755 flags.go:64] FLAG: --v="2"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938238 4755 flags.go:64] FLAG: --version="false"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938253 4755 flags.go:64] FLAG: --vmodule=""
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938267 4755 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.938280 4755 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938517 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938530 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938542 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938553 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938562 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938570 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938578 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938587 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938596 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938603 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938611 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938618 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938626 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938634 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938642 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938690 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938701 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938721 4755
feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938737 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938747 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938756 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938766 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938775 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938788 4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938799 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938817 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938826 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938839 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938852 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938865 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938877 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938887 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938898 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938909 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938917 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938926 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938934 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938942 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938950 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938958 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938966 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938977 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938987 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.938995 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939005 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939013 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939022 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939072 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939080 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939090 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939098 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939105 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939113 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939121 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939129 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939136 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:30:20 crc
kubenswrapper[4755]: W0320 13:30:20.939144 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939155 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939163 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939173 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939181 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939188 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939196 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939203 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939211 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939222 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939231 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939239 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939247 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939255 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.939263 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.939278 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.960497 4755 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.960563 4755 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960757 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960775 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960786 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960798 4755 feature_gate.go:330]
unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960809 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960819 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960829 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960840 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960851 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960862 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960872 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960882 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960892 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960901 4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960912 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960925 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960944 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960958 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960970 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960983 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.960994 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961009 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961021 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961032 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961043 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961054 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961064 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961074 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961083 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961094 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961104 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:30:20 crc
kubenswrapper[4755]: W0320 13:30:20.961114 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961124 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961134 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961146 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961157 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961167 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961176 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961190 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961203 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961216 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961226 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961238 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961249 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961259 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961270 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961281 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961291 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961301 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961311 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961322 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961332 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961343 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961357 4755
feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961368 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961377 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961387 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961397 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961407 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961416 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961426 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961436 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961444 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961454 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961464 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961473 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961482 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961491 4755 feature_gate.go:330] unrecognized feature gate:
MachineAPIProviderOpenStack
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961500 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961510 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961522 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.961539 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961870 4755 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961890 4755 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961900 4755 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961910 4755 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961920 4755 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961930 4755 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961941 4755 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:30:20 crc
kubenswrapper[4755]: W0320 13:30:20.961951 4755 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961962 4755 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961972 4755 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961983 4755 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.961993 4755 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962002 4755 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962013 4755 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962025 4755 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962036 4755 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962046 4755 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962056 4755 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962069 4755 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962083 4755 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962094 4755 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962104 4755 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962114 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962124 4755 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962134 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962144 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962154 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962164 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962175 4755 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962184 4755 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962194 4755 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962205 4755 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962215 4755 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320
13:30:20.962224 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962236 4755 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962246 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962257 4755 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962268 4755 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962277 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962288 4755 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962298 4755 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962309 4755 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962319 4755 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962333 4755 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962347 4755 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962359 4755 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962371 4755 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962382 4755 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962394 4755 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962404 4755 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962414 4755 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962427 4755 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962439 4755 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962450 4755 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962462 4755 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962473 4755 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962484 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962494 4755 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962504 4755 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962513 4755 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962523 4755 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962533 4755 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962543 4755 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962556 4755 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962569 4755 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962580 4755 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962591 4755 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962601 4755 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962611 4755 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962621 4755 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:30:20 crc kubenswrapper[4755]: W0320 13:30:20.962633 4755 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.962682 4755 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.964400 4755 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 13:30:20 crc kubenswrapper[4755]: E0320 13:30:20.970044 4755 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.977174 4755 bootstrap.go:101] 
"Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.977349 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.979133 4755 server.go:997] "Starting client certificate rotation" Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.979173 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 13:30:20 crc kubenswrapper[4755]: I0320 13:30:20.980597 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.010435 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.014244 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.018964 4755 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.039318 4755 log.go:25] "Validated CRI v1 runtime API" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.078523 4755 log.go:25] "Validated CRI v1 image API" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.082253 4755 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.090901 4755 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-24-34-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.090962 4755 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.121909 4755 manager.go:217] Machine: {Timestamp:2026-03-20 13:30:21.116811991 +0000 UTC m=+0.714744580 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ec91ed1b-a6ed-4cb2-884d-632a869fcc2d BootID:382501ad-cb22-4ccb-a572-771d7a82be1e Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 
DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:25:d0:34 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:25:d0:34 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:67:3a:c7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:b6:07:44 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:a0:4d:b7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c9:38:ec Speed:-1 Mtu:1496} {Name:eth10 MacAddress:9a:dc:16:6b:b6:6c Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:36:a1:c7:80:b2:1e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data 
Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.122324 4755 manager_no_libpfm.go:29] 
cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.122534 4755 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123027 4755 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123397 4755 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123457 4755 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"Les
sThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123883 4755 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.123905 4755 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124589 4755 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124648 4755 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.124950 4755 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.125100 4755 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130023 4755 kubelet.go:418] "Attempting to sync node with API server" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130065 4755 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130112 4755 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130138 4755 kubelet.go:324] "Adding apiserver pod source" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.130160 4755 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.135785 4755 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.141047 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.141259 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.141561 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.141641 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.145954 4755 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.149590 4755 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151590 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151719 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151789 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151845 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151899 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.151949 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152013 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152070 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152124 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152187 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152244 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.152300 4755 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.155919 4755 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.156816 4755 server.go:1280] "Started kubelet" Mar 20 13:30:21 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159729 4755 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159736 4755 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.159985 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160030 4755 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.160505 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160583 4755 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160635 4755 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.160689 4755 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.161386 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="200ms" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.161439 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: 
connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.161541 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.161767 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.161777 4755 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162002 4755 factory.go:55] Registering systemd factory Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162101 4755 factory.go:221] Registration of the systemd container factory successfully Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162101 4755 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162382 4755 server.go:460] "Adding debug handlers to kubelet server" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162745 4755 factory.go:153] Registering CRI-O factory Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.162904 4755 factory.go:221] Registration of the crio container factory successfully Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.163043 4755 factory.go:103] Registering Raw factory Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.163206 4755 manager.go:1196] Started watching for new ooms in manager Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.164017 4755 manager.go:319] Starting recovery of all containers Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.171077 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180786 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180899 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.180940 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" 
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.181258 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.181994 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182168 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182321 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182395 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182476 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182503 4755 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182538 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182568 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182636 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182767 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182849 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182897 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.182936 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183047 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183190 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183265 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183346 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183443 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183479 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183522 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183554 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183611 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183645 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183737 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183764 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183802 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183828 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183860 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183903 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183925 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183955 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.183977 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184012 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184037 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184066 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184114 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184141 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184172 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184196 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184223 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184269 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184299 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184327 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184349 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184375 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184442 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184484 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184520 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184546 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.184581 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185409 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185461 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185477 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185494 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185506 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185519 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185530 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185542 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185554 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185569 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185581 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185591 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185603 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185615 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185628 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185676 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185692 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185707 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185722 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185744 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185757 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185782 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185797 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185814 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185831 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185848 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.185868 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186274 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186312 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186328 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186345 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186362 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186377 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186389 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186420 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186432 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186448 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186473 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186488 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186503 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186519 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186532 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186546 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186561 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186576 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186591 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186621 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186644 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186684 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186704 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186722 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186739 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186754 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186770 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186793 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186844 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186861 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186878 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186896 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186949 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186964 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186978 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.186992 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187007 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187021 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187038 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187052 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187066 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187113 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187128 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187142 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187156 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187173 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187190 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187209 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187227 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187248 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187263 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187404 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187424 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187439 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187454 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187474 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187512 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187528 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187543 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187559 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187577 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187592 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187607 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187627 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187642 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies"
seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187721 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187739 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187757 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187773 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187790 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187806 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187823 
4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187840 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187856 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187872 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187886 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187901 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187916 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187937 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187956 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.187979 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188026 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188044 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188059 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" 
volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188075 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188092 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188107 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188122 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188138 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188156 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188177 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188194 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188211 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188227 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188243 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188262 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" 
seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188279 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188297 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188314 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188331 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188346 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188365 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188382 4755 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188396 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188415 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188428 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188445 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188461 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188476 4755 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188512 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188528 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188548 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188565 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188582 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188597 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188613 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.188628 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192352 4755 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192493 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192599 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192707 4755 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192787 4755 reconstruct.go:97] "Volume reconstruction finished" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.192853 4755 reconciler.go:26] "Reconciler: start to sync state" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.198767 4755 manager.go:324] Recovery completed Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.211919 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.214963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.215020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.215040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217441 4755 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217475 4755 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.217893 4755 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.221316 4755 kubelet_network_linux.go:50] "Initialized iptables 
rules." protocol="IPv4" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224272 4755 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224338 4755 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.224367 4755 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.224543 4755 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.225420 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.225553 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.236702 4755 policy_none.go:49] "None policy: Start" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.237845 4755 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.237886 4755 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.261372 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.294506 4755 
manager.go:334] "Starting Device Plugin manager" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295016 4755 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295056 4755 server.go:79] "Starting device plugin registration server" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295907 4755 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.295949 4755 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296286 4755 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296440 4755 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.296463 4755 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.312880 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.325630 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.325771 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327367 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327631 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.327684 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328412 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.328421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329513 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.329527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 
13:30:21.329775 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.330696 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.330771 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331155 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331363 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331463 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.331504 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.332960 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333318 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333568 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333693 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.333929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335054 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335294 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335347 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335693 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335761 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.335964 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.336842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.363506 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="400ms"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396401 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396649 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396754 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396826 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396896 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.396994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397037 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397435 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397510 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.397992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.398015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.398058 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.398691 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499499 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499615 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499691 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499743 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499853 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499876 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499992 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499777 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.499958 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500072 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500210 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500352 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500395 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500458 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.500559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.599108 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.601308 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.602000 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.653969 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.660642 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.678244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.698617 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: I0320 13:30:21.707749 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.723370 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43 WatchSource:0}: Error finding container 64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43: Status 404 returned error can't find the container with id 64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.730227 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a WatchSource:0}: Error finding container 9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a: Status 404 returned error can't find the container with id 9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.737298 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14 WatchSource:0}: Error finding container 5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14: Status 404 returned error can't find the container with id 5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.745291 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75 WatchSource:0}: Error finding container 57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75: Status 404 returned error can't find the container with id 57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75
Mar 20 13:30:21 crc kubenswrapper[4755]: W0320 13:30:21.748823 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7 WatchSource:0}: Error finding container f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7: Status 404 returned error can't find the container with id f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7
Mar 20 13:30:21 crc kubenswrapper[4755]: E0320 13:30:21.764719 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="800ms"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.002928 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.004991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.005026 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.005635 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc"
Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.043220 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.043363 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.163380 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.229808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"64fba2050dd472427600ff8d3cb5ad96cfd31bc1e6cc665add1169d04ff72f43"}
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.230939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9e2e2261e6d1f15f4bf1b331b9d4a7a2098c954a7532facfb80a608f65712c4a"}
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.232012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f43f38e05bb77ec402d05e0f95651a73c044a536785a47cd2cb4a55f126476e7"}
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.233251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"57fcc451c2d6113f6fca6e6aaf6f826c6cddb6246e87b8dac5c3d93ee13edd75"}
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.234088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5b9b8b38ff132dfc06ebb2dce03f550ca34919d8e18a7e138856fe7d9d05be14"}
Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.381818 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.381910 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.566056 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="1.6s"
Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.568739 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.568831 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:22 crc kubenswrapper[4755]: W0320 13:30:22.623587 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.623985 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.806095 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:22 crc kubenswrapper[4755]: I0320 13:30:22.808990 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:30:22 crc kubenswrapper[4755]: E0320 13:30:22.809600 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.055498 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:30:23 crc kubenswrapper[4755]: E0320 13:30:23.056977 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.163109 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.240053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea"}
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.242764 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" exitCode=0
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.242917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e"}
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.243014 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.245630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.248331 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.250955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251827 4755 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3"
containerID="406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251906 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.251934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252562 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252585 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.252596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.253864 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.253933 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.254041 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.255759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc 
kubenswrapper[4755]: I0320 13:30:23.255797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.255809 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257017 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13" exitCode=0 Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13"} Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.257144 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258265 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:23 crc kubenswrapper[4755]: I0320 13:30:23.258362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.163272 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:24 crc kubenswrapper[4755]: E0320 13:30:24.167333 4755 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="3.2s" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.262250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264770 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.264906 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.265703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267563 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97" exitCode=0 Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267620 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.267827 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269218 4755 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.269270 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.270927 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.270927 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271746 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.271754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274149 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274174 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.274250 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.276761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.410253 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411663 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:24 crc kubenswrapper[4755]: I0320 13:30:24.411812 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:24 crc 
kubenswrapper[4755]: E0320 13:30:24.412645 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.181:6443: connect: connection refused" node="crc" Mar 20 13:30:24 crc kubenswrapper[4755]: W0320 13:30:24.559021 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:24 crc kubenswrapper[4755]: E0320 13:30:24.559138 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:25 crc kubenswrapper[4755]: W0320 13:30:25.057731 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:30:25 crc kubenswrapper[4755]: E0320 13:30:25.057892 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.163040 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.181:6443: 
connect: connection refused Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.282203 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"} Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.283195 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284812 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.284916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287151 4755 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c" exitCode=0 Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287319 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287368 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287325 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287448 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287640 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c"} Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.287858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288747 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288806 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.288973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.289234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc 
kubenswrapper[4755]: I0320 13:30:25.289762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.290539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.290607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.523747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:25 crc kubenswrapper[4755]: I0320 13:30:25.740898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.293915 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.293970 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294547 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294601 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294636 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251"} Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294698 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.294834 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295073 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295094 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295438 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.295448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.296023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:26 crc 
kubenswrapper[4755]: I0320 13:30:26.296039 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:26 crc kubenswrapper[4755]: I0320 13:30:26.296047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4"} Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305802 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305920 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.305948 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.308620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.311173 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.311230 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 
13:30:27.311250 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.317775 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.476068 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.476330 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.478165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.613195 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:27 crc kubenswrapper[4755]: I0320 13:30:27.615504 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 
13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.308249 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.309767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.741251 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.741370 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.745465 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.745711 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747028 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:28 crc kubenswrapper[4755]: I0320 13:30:28.747083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.795825 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.796150 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797636 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:29 crc kubenswrapper[4755]: I0320 13:30:29.797674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.530708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.530937 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.532602 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.532720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:30 crc kubenswrapper[4755]: I0320 13:30:30.532751 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:31 crc kubenswrapper[4755]: E0320 13:30:31.313051 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.225759 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.226082 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229853 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.229882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.236331 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.318774 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.320711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.327350 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:32 crc kubenswrapper[4755]: I0320 13:30:32.962489 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.322078 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:33 crc kubenswrapper[4755]: I0320 13:30:33.323223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.324865 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:34 crc kubenswrapper[4755]: I0320 13:30:34.326865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.214967 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.215074 4755 trace.go:236] Trace[655135621]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:30:25.213) (total time: 10002ms):
Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[655135621]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:30:35.214)
Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[655135621]: [10.002005968s] [10.002005968s] END
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.215103 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.728804 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.729605 4755 trace.go:236] Trace[410276888]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:30:25.725) (total time: 10004ms):
Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[410276888]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (13:30:35.728)
Mar 20 13:30:35 crc kubenswrapper[4755]: Trace[410276888]: [10.004023796s] [10.004023796s] END
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.729870 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.734568 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.736189 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.738193 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.738322 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.738275 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:30:35 crc kubenswrapper[4755]: W0320 13:30:35.739633 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.739740 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:35 crc kubenswrapper[4755]: E0320 13:30:35.742039 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.746726 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:35Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.751744 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.751827 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.756696 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403}
Mar 20 13:30:35 crc kubenswrapper[4755]: I0320 13:30:35.756765 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.168346 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:36Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.332325 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334314 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786" exitCode=255
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"}
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.334541 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.335417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.336027 4755 scope.go:117] "RemoveContainer" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.463917 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.464216 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465717 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.465786 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.516208 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:36 crc kubenswrapper[4755]: I0320 13:30:36.968421 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.166595 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:37Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.340152 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.342097 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343412 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"}
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.343989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.344444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.344478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.344490 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:37 crc kubenswrapper[4755]: I0320 13:30:37.363990 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.167702 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:38Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.349010 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.350048 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352754 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" exitCode=255
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"}
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352933 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352967 4755 scope.go:117] "RemoveContainer" containerID="d6918c6eb2d4770ee12db31dcd75865882104c00eb8c8c1322fc1eaf16ef1786"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.352934 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354103 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354152 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354376 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354416 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.354434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.355098 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"
Mar 20 13:30:38 crc kubenswrapper[4755]: E0320 13:30:38.355356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.742960 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.743196 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:30:38 crc kubenswrapper[4755]: I0320 13:30:38.746228 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.166460 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:39Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.358601 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.360942 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.362351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.363443 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"
Mar 20 13:30:39 crc kubenswrapper[4755]: E0320 13:30:39.363800 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:30:39 crc kubenswrapper[4755]: I0320 13:30:39.805043 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:40 crc kubenswrapper[4755]: W0320 13:30:40.041769 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.041885 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.165425 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.363758 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365002 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.365116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.366138 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"
Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.366520 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:30:40 crc kubenswrapper[4755]: I0320 13:30:40.371431 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:30:40 crc kubenswrapper[4755]: W0320 13:30:40.773914 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:40 crc kubenswrapper[4755]: E0320 13:30:40.774081 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:40Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.165757 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:41Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:41 crc kubenswrapper[4755]: E0320 13:30:41.313930 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.366290 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367852 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.367995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:41 crc kubenswrapper[4755]: I0320 13:30:41.368912 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"
Mar 20 13:30:41 crc kubenswrapper[4755]: E0320 13:30:41.369259 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.136719 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.137901 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.138124 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.144324 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.165614 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:42Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.368559 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.369748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:30:42 crc kubenswrapper[4755]: I0320 13:30:42.370321 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163"
Mar 20 13:30:42 crc kubenswrapper[4755]: E0320 13:30:42.370509 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:30:43 crc kubenswrapper[4755]: I0320 13:30:43.168949 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:43Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:44 crc kubenswrapper[4755]: I0320 13:30:44.166296 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:44Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:44 crc kubenswrapper[4755]: I0320 13:30:44.253092 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:30:44 crc kubenswrapper[4755]: E0320 13:30:44.256631 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:44Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:30:45 crc kubenswrapper[4755]: I0320 13:30:45.166976 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:45Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:45 crc kubenswrapper[4755]: E0320 13:30:45.742258 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:45Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.167799 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z
Mar 20 13:30:46 crc kubenswrapper[4755]: W0320 13:30:46.608097 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not
yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.608229 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:46 crc kubenswrapper[4755]: W0320 13:30:46.698726 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.698829 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:46Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.967858 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.968075 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.969472 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:46 crc kubenswrapper[4755]: I0320 13:30:46.970033 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:46 crc kubenswrapper[4755]: E0320 13:30:46.970199 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:30:47 crc kubenswrapper[4755]: I0320 13:30:47.165384 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z Mar 20 13:30:47 crc kubenswrapper[4755]: W0320 13:30:47.314282 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z Mar 20 13:30:47 crc kubenswrapper[4755]: E0320 13:30:47.314405 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: 
Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.164756 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:48Z is after 2026-02-23T05:33:13Z Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.742082 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.742842 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.743124 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.743460 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.745430 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.746262 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 13:30:48 crc kubenswrapper[4755]: I0320 13:30:48.746615 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e" gracePeriod=30 Mar 20 13:30:49 crc kubenswrapper[4755]: W0320 13:30:49.124718 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.124879 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.143777 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.144704 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.146971 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:49 crc kubenswrapper[4755]: E0320 13:30:49.152703 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.169709 4755 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:49Z is after 2026-02-23T05:33:13Z Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.392626 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393255 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e" exitCode=255 Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393310 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"} Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"} Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.393453 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.394459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:49 crc kubenswrapper[4755]: I0320 13:30:49.394531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:49 crc 
kubenswrapper[4755]: I0320 13:30:49.394556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:50 crc kubenswrapper[4755]: I0320 13:30:50.166462 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:50Z is after 2026-02-23T05:33:13Z Mar 20 13:30:51 crc kubenswrapper[4755]: I0320 13:30:51.167057 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:51Z is after 2026-02-23T05:33:13Z Mar 20 13:30:51 crc kubenswrapper[4755]: E0320 13:30:51.314088 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.168340 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:52Z is after 2026-02-23T05:33:13Z Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.961821 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.962065 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963813 4755 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:52 crc kubenswrapper[4755]: I0320 13:30:52.963904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:53 crc kubenswrapper[4755]: I0320 13:30:53.168917 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:53Z is after 2026-02-23T05:33:13Z Mar 20 13:30:54 crc kubenswrapper[4755]: I0320 13:30:54.167818 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:54Z is after 2026-02-23T05:33:13Z Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.167980 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:55Z is after 2026-02-23T05:33:13Z Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.741704 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.741942 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:55 
crc kubenswrapper[4755]: I0320 13:30:55.743568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.743644 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:55 crc kubenswrapper[4755]: I0320 13:30:55.743838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:55 crc kubenswrapper[4755]: E0320 13:30:55.748609 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:30:56 crc kubenswrapper[4755]: E0320 13:30:56.147321 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.153792 4755 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155356 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.155427 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:30:56 crc kubenswrapper[4755]: E0320 13:30:56.160296 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:30:56 crc kubenswrapper[4755]: I0320 13:30:56.167192 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:56Z is after 2026-02-23T05:33:13Z Mar 20 13:30:57 crc kubenswrapper[4755]: I0320 13:30:57.167425 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:57Z is after 2026-02-23T05:33:13Z Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.168054 4755 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:58Z is after 2026-02-23T05:33:13Z Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.225196 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.226897 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.227726 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.741866 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:30:58 crc kubenswrapper[4755]: I0320 13:30:58.742020 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 
13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.169623 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:30:59Z is after 2026-02-23T05:33:13Z Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.429464 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.431708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"} Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.431887 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:30:59 crc kubenswrapper[4755]: I0320 13:30:59.433151 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.167906 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:00Z is 
after 2026-02-23T05:33:13Z Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.437801 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.438693 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441631 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1" exitCode=255 Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"} Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441781 4755 scope.go:117] "RemoveContainer" containerID="d5df1b8fcc02adbc88014730bc6e062080600c403524c597ddfbeee60be61163" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.441987 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443253 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443314 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 13:31:00.443339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:00 crc kubenswrapper[4755]: I0320 
13:31:00.444384 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"
Mar 20 13:31:00 crc kubenswrapper[4755]: E0320 13:31:00.444796 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.166761 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:01Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.314581 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.324355 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.330807 4755 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:31:01 crc kubenswrapper[4755]: E0320 13:31:01.332130 4755 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 20 13:31:01 crc kubenswrapper[4755]: I0320 13:31:01.448341 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:31:02 crc kubenswrapper[4755]: I0320 13:31:02.168288 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:02Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:03 crc kubenswrapper[4755]: E0320 13:31:03.153754 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.161183 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162498 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162509 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.162556 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:31:03 crc kubenswrapper[4755]: I0320 13:31:03.167335 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:03 crc kubenswrapper[4755]: E0320 13:31:03.167738 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:03Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:31:04 crc kubenswrapper[4755]: I0320 13:31:04.166300 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:04Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:05 crc kubenswrapper[4755]: I0320 13:31:05.168341 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:05 crc kubenswrapper[4755]: W0320 13:31:05.269704 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:05 crc kubenswrapper[4755]: E0320 13:31:05.269829 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:31:05 crc kubenswrapper[4755]: E0320 13:31:05.754689 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:31:06 crc kubenswrapper[4755]: W0320 13:31:06.139060 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:06 crc kubenswrapper[4755]: E0320 13:31:06.139206 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.168991 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:06Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.967702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.967983 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969727 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.969800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:06 crc kubenswrapper[4755]: I0320 13:31:06.970708 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"
Mar 20 13:31:06 crc kubenswrapper[4755]: E0320 13:31:06.971177 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:07 crc kubenswrapper[4755]: I0320 13:31:07.167150 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:07Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.168522 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:08Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.741824 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.741976 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.746220 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.746482 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748229 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.748323 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:08 crc kubenswrapper[4755]: I0320 13:31:08.749245 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"
Mar 20 13:31:08 crc kubenswrapper[4755]: E0320 13:31:08.749540 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:09 crc kubenswrapper[4755]: I0320 13:31:09.166361 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:09Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:10 crc kubenswrapper[4755]: E0320 13:31:10.159885 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.167878 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.167973 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169686 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:10 crc kubenswrapper[4755]: I0320 13:31:10.169769 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:31:10 crc kubenswrapper[4755]: E0320 13:31:10.175114 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:10Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:31:11 crc kubenswrapper[4755]: W0320 13:31:11.008031 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.008143 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:31:11 crc kubenswrapper[4755]: W0320 13:31:11.114931 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.115084 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:31:11 crc kubenswrapper[4755]: I0320 13:31:11.167410 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:11Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:11 crc kubenswrapper[4755]: E0320 13:31:11.314735 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:31:12 crc kubenswrapper[4755]: I0320 13:31:12.168054 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:12Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:13 crc kubenswrapper[4755]: I0320 13:31:13.167455 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:13Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.167765 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:14Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.260836 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.261125 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262846 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:14 crc kubenswrapper[4755]: I0320 13:31:14.262951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:15 crc kubenswrapper[4755]: I0320 13:31:15.167812 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:15Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:15 crc kubenswrapper[4755]: E0320 13:31:15.761835 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:31:16 crc kubenswrapper[4755]: I0320 13:31:16.167744 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:16Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:17 crc kubenswrapper[4755]: E0320 13:31:17.165225 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.167753 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.175900 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:17 crc kubenswrapper[4755]: I0320 13:31:17.177988 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:31:17 crc kubenswrapper[4755]: E0320 13:31:17.181394 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.168432 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:18Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742007 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742148 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742234 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.742484 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744147 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.744947 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 20 13:31:18 crc kubenswrapper[4755]: I0320 13:31:18.745109 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8" gracePeriod=30
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.169205 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:19Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.520906 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.523081 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.523971 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8" exitCode=255
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524052 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8"}
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3"}
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524148 4755 scope.go:117] "RemoveContainer" containerID="ad40c7015ee1d0f5868379a57ef2217a22dd88cb925d582d221c125db652e78e"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.524284 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.525949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.526015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:19 crc kubenswrapper[4755]: I0320 13:31:19.526042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:20 crc kubenswrapper[4755]: I0320 13:31:20.164766 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:20Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:20 crc kubenswrapper[4755]: I0320 13:31:20.530306 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.167599 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.224640 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226303 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.226393 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.227379 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"
Mar 20 13:31:21 crc kubenswrapper[4755]: E0320 13:31:21.315104 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.536938 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.538155 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"}
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.538306 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539026 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:21 crc kubenswrapper[4755]: I0320 13:31:21.539065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.167948 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:13Z
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.545105 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.545961 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548690 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" exitCode=255
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"}
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.548786 4755 scope.go:117] "RemoveContainer" containerID="2b0f1a1f64f8af9559f9bf7078ec0fdde8ae1640f23996277a3182b12a8a0ea1"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.549024 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.550867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.551458 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"
Mar 20 13:31:22 crc kubenswrapper[4755]: E0320 13:31:22.551615 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.962415 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.962722 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:22 crc kubenswrapper[4755]: I0320 13:31:22.964625 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:23 crc kubenswrapper[4755]: I0320 13:31:23.168031 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:31:23 crc kubenswrapper[4755]: I0320 13:31:23.555043 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 13:31:24 crc kubenswrapper[4755]: E0320 13:31:24.167766 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.167896 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.181930 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183095 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183129 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:24 crc kubenswrapper[4755]: I0320 13:31:24.183162 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:31:24 crc kubenswrapper[4755]: E0320 13:31:24.190959 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.167876 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.741755 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.742030 4755 kubelet_node_status.go:401] "Setting
node annotation to enable volume controller attach/detach" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743816 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743884 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:25 crc kubenswrapper[4755]: I0320 13:31:25.743901 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.771242 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb2d31970 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,LastTimestamp:2026-03-20 13:30:21.156768112 +0000 UTC m=+0.754700641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.776476 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.781837 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.787087 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC 
m=+0.812982152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.792934 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fcebc88d49d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.319672989 +0000 UTC m=+0.917605528,LastTimestamp:2026-03-20 13:30:21.319672989 +0000 UTC m=+0.917605528,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.800187 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.327144893 +0000 UTC m=+0.925077422,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc 
kubenswrapper[4755]: E0320 13:31:25.806813 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.327171834 +0000 UTC m=+0.925104363,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.811988 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.327182474 +0000 UTC m=+0.925115003,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.817982 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.32840772 +0000 UTC m=+0.926340249,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.822543 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.32841782 +0000 UTC m=+0.926350349,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.826747 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.328426481 +0000 UTC m=+0.926359000,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.832185 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.329498603 +0000 UTC m=+0.927431142,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.840573 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.329522533 +0000 UTC m=+0.927455072,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.846025 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.329534333 +0000 UTC m=+0.927466872,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.852371 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.331199138 +0000 UTC m=+0.929131677,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.857435 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.331213869 +0000 UTC m=+0.929146408,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.863248 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC 
m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.331224599 +0000 UTC m=+0.929157138,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.867500 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.333021076 +0000 UTC m=+0.930953615,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.871191 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.333035517 +0000 UTC m=+0.930968056,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.876490 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.333046477 +0000 UTC m=+0.930979016,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.885829 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.333910635 +0000 UTC m=+0.931843174,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.891939 4755 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.333925125 +0000 UTC m=+0.931857664,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.896823 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c678d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c678d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215049613 +0000 UTC m=+0.812982152,LastTimestamp:2026-03-20 13:30:21.333935245 +0000 UTC m=+0.931867784,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.902952 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64ba5f4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in 
API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64ba5f4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215000052 +0000 UTC m=+0.812932591,LastTimestamp:2026-03-20 13:30:21.335036238 +0000 UTC m=+0.932968777,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.909908 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8fceb64c289f\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8fceb64c289f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.215033503 +0000 UTC m=+0.812966042,LastTimestamp:2026-03-20 13:30:21.335050168 +0000 UTC m=+0.932982707,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.918140 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fced5103449 openshift-machine-config-operator 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.731198025 +0000 UTC m=+1.329130564,LastTimestamp:2026-03-20 13:30:21.731198025 +0000 UTC m=+1.329130564,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.923422 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fced568a5bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.736994235 +0000 UTC m=+1.334926764,LastTimestamp:2026-03-20 13:30:21.736994235 +0000 UTC m=+1.334926764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 
crc kubenswrapper[4755]: E0320 13:31:25.929352 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fced5977e8f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.740064399 +0000 UTC m=+1.337996958,LastTimestamp:2026-03-20 13:30:21.740064399 +0000 UTC m=+1.337996958,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.935705 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fced627be29 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:21.749517865 +0000 UTC m=+1.347450424,LastTimestamp:2026-03-20 13:30:21.749517865 +0000 UTC m=+1.347450424,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.942578 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fced691c4e0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:21.7564664 +0000 UTC m=+1.354398969,LastTimestamp:2026-03-20 13:30:21.7564664 +0000 UTC m=+1.354398969,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.948137 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf1231daf6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.756813558 +0000 UTC m=+2.354746127,LastTimestamp:2026-03-20 13:30:22.756813558 +0000 UTC m=+2.354746127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.952735 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf13b67c8c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782282892 +0000 UTC m=+2.380215461,LastTimestamp:2026-03-20 13:30:22.782282892 +0000 UTC m=+2.380215461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.956914 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf13ba920a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782550538 +0000 UTC m=+2.380483097,LastTimestamp:2026-03-20 13:30:22.782550538 +0000 UTC m=+2.380483097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.961788 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf13c0b3b9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782952377 +0000 UTC m=+2.380884946,LastTimestamp:2026-03-20 13:30:22.782952377 +0000 UTC m=+2.380884946,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.968681 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf13c11c91 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.782979217 +0000 UTC m=+2.380911776,LastTimestamp:2026-03-20 13:30:22.782979217 +0000 UTC m=+2.380911776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.976816 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf200d7b27 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:22.989310759 +0000 UTC m=+2.587243318,LastTimestamp:2026-03-20 13:30:22.989310759 +0000 UTC m=+2.587243318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.983949 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf27b70181 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.117861249 +0000 UTC m=+2.715793808,LastTimestamp:2026-03-20 13:30:23.117861249 +0000 UTC m=+2.715793808,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.987910 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf280e62ac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.123587756 +0000 UTC m=+2.721520315,LastTimestamp:2026-03-20 13:30:23.123587756 +0000 UTC m=+2.721520315,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.992901 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf2813d9f6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.123945974 +0000 UTC m=+2.721878543,LastTimestamp:2026-03-20 13:30:23.123945974 +0000 UTC m=+2.721878543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:25 crc kubenswrapper[4755]: E0320 13:31:25.996475 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf281d1979 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.124552057 +0000 UTC m=+2.722484626,LastTimestamp:2026-03-20 13:30:23.124552057 +0000 UTC m=+2.722484626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.000767 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: 
User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf28517c01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,LastTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.004742 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf2f75c2bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.247803069 +0000 UTC 
m=+2.845735598,LastTimestamp:2026-03-20 13:30:23.247803069 +0000 UTC m=+2.845735598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.008900 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf2fcedf50 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.253643088 +0000 UTC m=+2.851575617,LastTimestamp:2026-03-20 13:30:23.253643088 +0000 UTC m=+2.851575617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.013753 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf3026ebbc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.259413436 +0000 UTC m=+2.857346005,LastTimestamp:2026-03-20 13:30:23.259413436 +0000 UTC m=+2.857346005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.017490 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf303558a1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.260358817 +0000 UTC m=+2.858291386,LastTimestamp:2026-03-20 13:30:23.260358817 +0000 UTC m=+2.858291386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.021156 4755 event.go:359] 
"Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf3d530922 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.480408354 +0000 UTC m=+3.078340883,LastTimestamp:2026-03-20 13:30:23.480408354 +0000 UTC m=+3.078340883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.026084 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3d634259 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,LastTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.030531 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3d63e46c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481513068 +0000 UTC m=+3.079445597,LastTimestamp:2026-03-20 13:30:23.481513068 +0000 UTC m=+3.079445597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.035418 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf3d6a07bf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481915327 +0000 UTC m=+3.079847856,LastTimestamp:2026-03-20 13:30:23.481915327 +0000 UTC 
m=+3.079847856,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.039846 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf3dbbc93c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.487273276 +0000 UTC m=+3.085205815,LastTimestamp:2026-03-20 13:30:23.487273276 +0000 UTC m=+3.085205815,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.044634 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3e09484e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.492352078 +0000 UTC m=+3.090284607,LastTimestamp:2026-03-20 
13:30:23.492352078 +0000 UTC m=+3.090284607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.048571 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf3e163ba8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.493200808 +0000 UTC m=+3.091133337,LastTimestamp:2026-03-20 13:30:23.493200808 +0000 UTC m=+3.091133337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.053038 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e469596 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,LastTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.058201 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e538dda openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.497219546 +0000 UTC m=+3.095152075,LastTimestamp:2026-03-20 13:30:23.497219546 +0000 UTC m=+3.095152075,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.064143 4755 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8fcf40083710 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.52583656 +0000 UTC m=+3.123769089,LastTimestamp:2026-03-20 13:30:23.52583656 +0000 UTC m=+3.123769089,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.068610 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4031bb1c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.52855734 +0000 UTC m=+3.126489869,LastTimestamp:2026-03-20 13:30:23.52855734 +0000 UTC m=+3.126489869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc 
kubenswrapper[4755]: E0320 13:31:26.073144 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf403a414f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.529115983 +0000 UTC m=+3.127048512,LastTimestamp:2026-03-20 13:30:23.529115983 +0000 UTC m=+3.127048512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.078047 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4056bf88 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.530983304 +0000 UTC m=+3.128915833,LastTimestamp:2026-03-20 13:30:23.530983304 +0000 UTC 
m=+3.128915833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.082564 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4b256111 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.712297233 +0000 UTC m=+3.310229802,LastTimestamp:2026-03-20 13:30:23.712297233 +0000 UTC m=+3.310229802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.088082 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4b81cbad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.718353837 +0000 UTC m=+3.316286366,LastTimestamp:2026-03-20 13:30:23.718353837 +0000 UTC m=+3.316286366,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.094093 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4c887f70 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.735570288 +0000 UTC m=+3.333502847,LastTimestamp:2026-03-20 13:30:23.735570288 +0000 UTC m=+3.333502847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.098051 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf4ca21654 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.737247316 +0000 UTC m=+3.335179845,LastTimestamp:2026-03-20 13:30:23.737247316 +0000 UTC m=+3.335179845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.102841 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4d049fbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.743705019 +0000 UTC m=+3.341637558,LastTimestamp:2026-03-20 13:30:23.743705019 +0000 UTC m=+3.341637558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.108478 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4d2b82cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.746253515 +0000 UTC m=+3.344186054,LastTimestamp:2026-03-20 13:30:23.746253515 +0000 UTC m=+3.344186054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.113789 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf4d57b66f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.749150319 +0000 UTC m=+3.347082848,LastTimestamp:2026-03-20 13:30:23.749150319 +0000 UTC 
m=+3.347082848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.118366 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4f0dd58f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.777863055 +0000 UTC m=+3.375795584,LastTimestamp:2026-03-20 13:30:23.777863055 +0000 UTC m=+3.375795584,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.123685 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf4f27ad05 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.779556613 +0000 UTC m=+3.377489152,LastTimestamp:2026-03-20 13:30:23.779556613 +0000 UTC m=+3.377489152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.128720 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf59d18c94 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.958461588 +0000 UTC m=+3.556394117,LastTimestamp:2026-03-20 13:30:23.958461588 +0000 UTC m=+3.556394117,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.138908 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf5a5b7c56 openshift-kube-scheduler 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.967501398 +0000 UTC m=+3.565433927,LastTimestamp:2026-03-20 13:30:23.967501398 +0000 UTC m=+3.565433927,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.143542 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf5b6859cb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.985121739 +0000 UTC m=+3.583054268,LastTimestamp:2026-03-20 13:30:23.985121739 +0000 UTC m=+3.583054268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.147707 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8fcf5bb4f1ba openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.99014137 +0000 UTC m=+3.588073899,LastTimestamp:2026-03-20 13:30:23.99014137 +0000 UTC m=+3.588073899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.151975 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5c79045b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.002991195 +0000 UTC m=+3.600923724,LastTimestamp:2026-03-20 13:30:24.002991195 +0000 UTC m=+3.600923724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: 
E0320 13:31:26.156277 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5dc1e79e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.024545182 +0000 UTC m=+3.622477711,LastTimestamp:2026-03-20 13:30:24.024545182 +0000 UTC m=+3.622477711,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.160170 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf5dd735ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.025941454 +0000 UTC 
m=+3.623873993,LastTimestamp:2026-03-20 13:30:24.025941454 +0000 UTC m=+3.623873993,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.166226 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.166179 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6a6a16de openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.236893918 +0000 UTC m=+3.834826477,LastTimestamp:2026-03-20 13:30:24.236893918 +0000 UTC m=+3.834826477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.169966 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b766474 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.254477428 +0000 UTC m=+3.852409967,LastTimestamp:2026-03-20 13:30:24.254477428 +0000 UTC m=+3.852409967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.175016 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b8ba7db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,LastTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.181041 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create 
resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf6c6e562a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.270726698 +0000 UTC m=+3.868659257,LastTimestamp:2026-03-20 13:30:24.270726698 +0000 UTC m=+3.868659257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.186724 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf78b4dcb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,LastTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc 
kubenswrapper[4755]: E0320 13:31:26.193063 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf78f6a60e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.480986638 +0000 UTC m=+4.078919167,LastTimestamp:2026-03-20 13:30:24.480986638 +0000 UTC m=+4.078919167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.194805 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf79938738 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,LastTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.200743 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcf7a15ddbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.499809725 +0000 UTC m=+4.097742254,LastTimestamp:2026-03-20 13:30:24.499809725 +0000 UTC m=+4.097742254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.206889 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfa936342d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.290458157 +0000 UTC m=+4.888390726,LastTimestamp:2026-03-20 13:30:25.290458157 +0000 UTC 
m=+4.888390726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.212065 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb6898698 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.514022552 +0000 UTC m=+5.111955081,LastTimestamp:2026-03-20 13:30:25.514022552 +0000 UTC m=+5.111955081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.216182 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb747bda8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.526488488 +0000 UTC m=+5.124421017,LastTimestamp:2026-03-20 13:30:25.526488488 +0000 UTC m=+5.124421017,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.220670 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfb75abf96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.527734166 +0000 UTC m=+5.125666695,LastTimestamp:2026-03-20 13:30:25.527734166 +0000 UTC m=+5.125666695,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.226097 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc4d36f69 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.753747305 +0000 UTC m=+5.351679874,LastTimestamp:2026-03-20 13:30:25.753747305 +0000 UTC 
m=+5.351679874,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.230634 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc58d1b0c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.765915404 +0000 UTC m=+5.363847973,LastTimestamp:2026-03-20 13:30:25.765915404 +0000 UTC m=+5.363847973,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.235446 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfc5a51121 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.767485729 +0000 UTC 
m=+5.365418248,LastTimestamp:2026-03-20 13:30:25.767485729 +0000 UTC m=+5.365418248,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.239792 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd3377299 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:25.995182745 +0000 UTC m=+5.593115274,LastTimestamp:2026-03-20 13:30:25.995182745 +0000 UTC m=+5.593115274,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.243806 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd4217b55 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.010520405 +0000 UTC m=+5.608452934,LastTimestamp:2026-03-20 13:30:26.010520405 +0000 UTC 
m=+5.608452934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.250314 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfd43202df openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.011603679 +0000 UTC m=+5.609536208,LastTimestamp:2026-03-20 13:30:26.011603679 +0000 UTC m=+5.609536208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.255101 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe30318fc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.260187388 +0000 UTC 
m=+5.858119917,LastTimestamp:2026-03-20 13:30:26.260187388 +0000 UTC m=+5.858119917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.259144 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe4323456 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.280051798 +0000 UTC m=+5.877984327,LastTimestamp:2026-03-20 13:30:26.280051798 +0000 UTC m=+5.877984327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.266107 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcfe45e94f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.282960113 +0000 UTC m=+5.880892642,LastTimestamp:2026-03-20 13:30:26.282960113 +0000 UTC m=+5.880892642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.270645 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcff14b2bb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:26.499791798 +0000 UTC m=+6.097724337,LastTimestamp:2026-03-20 13:30:26.499791798 +0000 UTC m=+6.097724337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.278239 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8fcff2685703 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:26.518480643 +0000 UTC m=+6.116413192,LastTimestamp:2026-03-20 13:30:26.518480643 +0000 UTC m=+6.116413192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.285882 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.289483 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.295589 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c1a904 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:31:26 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 13:31:26 crc kubenswrapper[4755]: Mar 20 13:31:26 crc kubenswrapper[4755]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.75180314 +0000 UTC m=+15.349735679,LastTimestamp:2026-03-20 13:30:35.75180314 +0000 UTC m=+15.349735679,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.296775 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c33957 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.751905623 +0000 UTC m=+15.349838162,LastTimestamp:2026-03-20 13:30:35.751905623 +0000 UTC m=+15.349838162,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.299855 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-apiserver-crc.189e8fd2190d0fd8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 13:31:26 crc kubenswrapper[4755]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\": RBAC: [clusterrole.rbac.authorization.k8s.io \"system:public-info-viewer\" not found, clusterrole.rbac.authorization.k8s.io \"system:openshift:public-info-viewer\" not found]","reason":"Forbidden","details":{},"code":403} Mar 20 13:31:26 crc kubenswrapper[4755]: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:35.756744664 +0000 UTC m=+15.354677203,LastTimestamp:2026-03-20 13:30:35.756744664 +0000 UTC m=+15.354677203,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.305281 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fd218c33957\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fd218c33957 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
13:30:35.751905623 +0000 UTC m=+15.349838162,LastTimestamp:2026-03-20 13:30:35.756803085 +0000 UTC m=+15.354735634,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.309313 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf6b8ba7db\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf6b8ba7db openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.255870939 +0000 UTC m=+3.853803468,LastTimestamp:2026-03-20 13:30:36.337002727 +0000 UTC m=+15.934935256,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.313712 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf78b4dcb5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf78b4dcb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.476675253 +0000 UTC m=+4.074607772,LastTimestamp:2026-03-20 13:30:36.589389998 +0000 UTC m=+16.187322527,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.317821 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8fcf79938738\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8fcf79938738 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:24.491267896 +0000 UTC m=+4.089200425,LastTimestamp:2026-03-20 13:30:36.606442021 +0000 UTC m=+16.204374550,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.322395 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:38.743146511 +0000 UTC m=+18.341079080,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.327018 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:38.743235624 +0000 UTC m=+18.341168183,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.332701 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:48.742787215 +0000 UTC m=+28.340719784,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.337862 4755 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:48.74296303 +0000 UTC m=+28.340895599,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.343031 4755 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd51f4deae4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:48.746560228 
+0000 UTC m=+28.344492797,LastTimestamp:2026-03-20 13:30:48.746560228 +0000 UTC m=+28.344492797,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.349286 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf28517c01\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf28517c01 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.127985153 +0000 UTC m=+2.725917692,LastTimestamp:2026-03-20 13:30:48.865021428 +0000 UTC m=+28.462953957,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.354766 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf3d634259\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3d634259 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.481471577 +0000 UTC m=+3.079404106,LastTimestamp:2026-03-20 13:30:49.095701028 +0000 UTC m=+28.693633557,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.360905 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fcf3e469596\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fcf3e469596 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:23.496369558 +0000 UTC m=+3.094302087,LastTimestamp:2026-03-20 13:30:49.107178 +0000 UTC m=+28.705110569,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.366779 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" 
cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:30:58.741974225 +0000 UTC m=+38.339906784,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.371219 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e8af46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e8af46 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741484358 +0000 UTC m=+8.339416897,LastTimestamp:2026-03-20 13:30:58.742061577 +0000 UTC m=+38.339994146,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.377043 4755 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8fd076e66764\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:31:26 crc kubenswrapper[4755]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8fd076e66764 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:31:26 crc kubenswrapper[4755]: body: Mar 20 13:31:26 crc kubenswrapper[4755]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:30:28.741334884 +0000 UTC m=+8.339267423,LastTimestamp:2026-03-20 13:31:08.741933408 +0000 UTC m=+48.339865977,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:31:26 crc kubenswrapper[4755]: > Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.967614 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.967844 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.968971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:26 crc kubenswrapper[4755]: I0320 13:31:26.969587 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:26 crc kubenswrapper[4755]: E0320 13:31:26.969819 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:27 crc kubenswrapper[4755]: I0320 13:31:27.170274 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.167982 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.742538 4755 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.742630 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.746349 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.746563 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.748359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:28 crc kubenswrapper[4755]: I0320 13:31:28.749367 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:28 crc kubenswrapper[4755]: E0320 
13:31:28.749705 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:29 crc kubenswrapper[4755]: I0320 13:31:29.166929 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:30 crc kubenswrapper[4755]: I0320 13:31:30.171385 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.170407 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.171251 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.191366 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193388 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:31 crc kubenswrapper[4755]: I0320 13:31:31.193483 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.200680 4755 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.316598 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:31 crc kubenswrapper[4755]: W0320 13:31:31.451161 4755 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 13:31:31 crc kubenswrapper[4755]: E0320 13:31:31.451254 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:31:32 crc kubenswrapper[4755]: I0320 13:31:32.170953 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:32 crc kubenswrapper[4755]: W0320 13:31:32.827737 4755 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:31:32 crc kubenswrapper[4755]: E0320 13:31:32.827813 4755 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.167214 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.333689 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:31:33 crc kubenswrapper[4755]: I0320 13:31:33.358075 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:31:34 crc kubenswrapper[4755]: I0320 13:31:34.173548 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.170254 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.749692 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.749961 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.751540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:35 crc kubenswrapper[4755]: I0320 13:31:35.757417 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.168241 4755 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.596108 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.597906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:36 crc 
kubenswrapper[4755]: I0320 13:31:36.897541 4755 csr.go:261] certificate signing request csr-xt5nl is approved, waiting to be issued Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.909006 4755 csr.go:257] certificate signing request csr-xt5nl is issued Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.954540 4755 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 13:31:36 crc kubenswrapper[4755]: I0320 13:31:36.979475 4755 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 13:31:37 crc kubenswrapper[4755]: I0320 13:31:37.911286 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-28 07:34:26.827083117 +0000 UTC Mar 20 13:31:37 crc kubenswrapper[4755]: I0320 13:31:37.911351 4755 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6786h2m48.915736057s for next certificate rotation Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.201899 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.203931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.204149 4755 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.216520 4755 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 
20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.217016 4755 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.217060 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222638 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.222719 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.243145 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.253696 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.269966 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281562 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.281588 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.298319 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:38 crc kubenswrapper[4755]: I0320 13:31:38.311288 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:38Z","lastTransitionTime":"2026-03-20T13:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328562 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328832 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.328877 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.429182 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.529780 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.630748 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.731196 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.832086 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:38 crc kubenswrapper[4755]: E0320 13:31:38.933093 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.033479 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.134442 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.235537 4755 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.336295 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.436722 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.537744 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.638199 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.738801 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.840008 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:39 crc kubenswrapper[4755]: E0320 13:31:39.941138 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.042247 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.143235 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.225061 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.226842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:40 crc 
kubenswrapper[4755]: I0320 13:31:40.226904 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:40 crc kubenswrapper[4755]: I0320 13:31:40.226920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.244421 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.345506 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.446486 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.546897 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:40 crc kubenswrapper[4755]: E0320 13:31:40.647747 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.106450 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.206634 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.307105 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.317128 4755 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.408049 4755 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.508999 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.609232 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.710264 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.810757 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:41 crc kubenswrapper[4755]: E0320 13:31:41.911800 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.012591 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.113629 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.214264 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.225039 4755 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226817 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.226837 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.227891 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.228196 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.314922 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: I0320 13:31:42.360934 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.415495 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.516373 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.617330 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.718344 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.819040 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:42 crc kubenswrapper[4755]: E0320 13:31:42.919257 
4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.020052 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.120672 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.221343 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.322294 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.423227 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.524145 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.624513 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.724767 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.825775 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:43 crc kubenswrapper[4755]: E0320 13:31:43.926469 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.027600 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc 
kubenswrapper[4755]: E0320 13:31:44.128341 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.229260 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.329634 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.430738 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.531721 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.632332 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.732969 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.833139 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:44 crc kubenswrapper[4755]: E0320 13:31:44.933947 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.034995 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.135853 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.236800 4755 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.337552 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.438521 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.539697 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.639798 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.740884 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.841725 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:45 crc kubenswrapper[4755]: E0320 13:31:45.942728 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.043889 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.144065 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.244957 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: E0320 13:31:46.346097 4755 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.382914 4755 reflector.go:368] Caches 
populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.448935 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552805 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552890 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.552982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.655946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.655996 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.656044 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.758896 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.862939 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966777 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:46 crc kubenswrapper[4755]: I0320 13:31:46.966815 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:46Z","lastTransitionTime":"2026-03-20T13:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069417 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.069518 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.172324 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.180636 4755 apiserver.go:52] "Watching apiserver" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.186082 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.186506 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187011 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187063 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187184 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187235 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187424 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.187636 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.187906 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.188128 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.188201 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.190027 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191592 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191670 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.191815 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.194225 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.194371 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.195018 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.228405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.251383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.262216 4755 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.267295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.275316 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.280492 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.295409 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.309336 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.326477 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.336585 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352317 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352386 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352441 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352469 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352516 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 
13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352566 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352588 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352617 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352641 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352734 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352723 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352756 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352822 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352874 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352904 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352920 4755 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352928 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.352982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353075 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353076 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353130 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353199 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353218 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353243 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353289 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353386 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353493 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353560 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353708 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353715 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353742 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353775 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353872 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353927 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353949 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353952 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.353993 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354013 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354018 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354087 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354103 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354138 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354169 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354222 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354240 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354303 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354323 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354356 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354391 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354448 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354485 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354501 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354576 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354611 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354688 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354722 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354760 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354776 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354893 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354931 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354998 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355035 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355057 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355073 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355091 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355108 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355125 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355159 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355176 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355194 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355213 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355233 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355255 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355346 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355366 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355453 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355509 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355554 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356564 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356910 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356949 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356995 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357032 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName:
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357184 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357220 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:31:47 
crc kubenswrapper[4755]: I0320 13:31:47.357257 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357290 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357370 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357449 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357490 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357525 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357848 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.357884 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357959 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358004 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod 
\"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358127 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358163 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358196 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358258 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358290 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358631 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358769 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358808 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358882 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.358913 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358944 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.358977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359047 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359116 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359157 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359196 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359264 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: 
\"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359298 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359335 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359371 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359447 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359482 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361090 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" 
(UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361145 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:31:47 crc 
kubenswrapper[4755]: I0320 13:31:47.362217 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362244 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362343 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362433 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod 
\"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362556 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362701 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362738 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362816 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362833 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362851 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362867 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362905 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362927 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362954 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362972 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362992 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363015 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363034 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363054 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363076 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354115 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354418 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354446 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354505 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354587 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354693 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.354852 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355471 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355617 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.355942 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.356315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357178 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357730 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357910 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.357935 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.359538 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360096 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366614 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360163 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360322 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360638 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360757 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361290 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.360952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.361980 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362195 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.362508 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363740 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.363829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364009 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364512 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.364998 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365172 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365788 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.365903 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366111 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366279 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.366909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367474 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367926 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367936 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.367948 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368016 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368103 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368353 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.368368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369389 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369481 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.369832 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.369984 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370362 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370360 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.370536 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.870500899 +0000 UTC m=+87.468433618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.372187 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.872169886 +0000 UTC m=+87.470102645 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.370926 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371094 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371108 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371294 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371359 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371438 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371460 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.371673 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.372564 4755 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.373025 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.373107 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.873085171 +0000 UTC m=+87.471017710 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.373523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.373600 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387433 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387476 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387499 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.387586 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.887558661 +0000 UTC m=+87.485491380 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.390189 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391136 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.391245 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.392807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.392892 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393230 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393255 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393706 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.393718 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393827 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393866 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.393881 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.394334 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.394380 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:47.893918197 +0000 UTC m=+87.491850936 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.394556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395012 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395342 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395475 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395741 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.395795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.396174 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.397358 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.398754 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.398965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.399014 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.403873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407870 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.407939 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408106 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408293 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408343 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.408422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.409684 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410220 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410570 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.410845 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411501 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.411839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412275 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412330 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.412645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413212 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413310 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414252 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414309 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414434 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414646 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414755 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.413748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.414959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415373 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415466 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415510 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415544 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415589 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415594 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.415817 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416033 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416084 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416245 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416440 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416700 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416821 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.416875 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417110 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417125 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.417952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418756 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.418900 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419178 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419235 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419271 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419669 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.419918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.420590 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.420695 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.421143 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.421620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.427529 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.434882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.441364 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.444712 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.454518 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464329 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464353 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464398 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464513 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.464924 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465055 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465158 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465257 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465380 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465498 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465852 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.465971 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466091 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466210 4755 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466304 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466392 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466474 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466549 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466633 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466746 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466838 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.466930 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467014 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467097 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467180 4755 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467280 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467365 4755 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467451 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467574 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467675 4755 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467756 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467847 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.467934 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468016 4755 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468094 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468175 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468250 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468332 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468532 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468629 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468758 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468852 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.468934 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469019 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469109 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469205 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469290 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469366 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469445 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469548 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469630 4755 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469748 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469832 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469910 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.469984 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470082 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470190 4755 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470295 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470392 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470480 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470566 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470648 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470775 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470877 4755 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.470988 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471097 4755 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471219 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471332 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471444 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471587 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471730 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471862 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.471982 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472091 4755 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472179 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472260 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472335 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472424 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472501 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472581 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472704 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472814 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.472934 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473061 4755 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473151 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473228 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473321 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473498 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473588 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473695 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473776 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473850 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473924 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.473997 4755 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474091 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474170 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474252 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474326 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474551 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474636 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474750 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.474834 4755 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475162 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475241 4755 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475323 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475433 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475541 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475628 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475736 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475771 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475790 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475804 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475819 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475835 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475850 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475863 4755 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475878 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475891 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475905 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475917 4755 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475930 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475943 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475956 4755 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475968 4755 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.475985 4755 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476002 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476017 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476032 4755 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476045 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476058 4755 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476092 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476106 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476121 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476136 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476150 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476175 4755 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476188 4755 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476200 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476213 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476226 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476238 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476251 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476264 4755 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476277 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476291 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476304 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476316 4755 reconciler_common.go:293]
"Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476328 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476341 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476353 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476366 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476379 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476400 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476414 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476426 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476438 4755 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476450 4755 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476462 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476475 4755 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476490 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476502 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476515 4755 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476527 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476542 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476556 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476571 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476584 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476599 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476612 4755 
reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476624 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476640 4755 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476677 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476689 4755 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476703 4755 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476716 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476727 4755 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476740 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476752 4755 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476763 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476775 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.476787 4755 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.494580 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.508376 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.527573 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.532717 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},Star
tupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.536027 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.536311 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:31:47 crc kubenswrapper[4755]: W0320 13:31:47.538583 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd WatchSource:0}: Error finding container 2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd: Status 404 returned error can't find the container with id 2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.542363 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 
13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:31:47 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:31:47 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:31:47 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:31:47 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:31:47 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:31:47 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:31:47 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:31:47 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.547899 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:31:47 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.549715 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.551177 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:31:47 crc kubenswrapper[4755]: else Mar 20 13:31:47 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:31:47 crc kubenswrapper[4755]: exit 1 Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:31:47 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.552338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 
13:31:47.597526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.597571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.630885 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"ad044424dec10a9b85e73f38d92753d33328812ddc6316e8013c584622dfca9b"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.632431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2b8dfc203055c0f7ff1ac4bceac4dd347eca86fc33502707e7905100327553fd"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.633916 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8c6f52fc6626fb0935c8fd231551aaddd00cb052a5ed9129d1ae96ea000b0a56"} Mar 20 13:31:47 crc 
kubenswrapper[4755]: E0320 13:31:47.634276 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:31:47 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:31:47 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:31:47 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:31:47 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:31:47 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:31:47 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:31:47 crc kubenswrapper[4755]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:31:47 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.636040 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.636242 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:31:47 crc 
kubenswrapper[4755]: else Mar 20 13:31:47 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:31:47 crc kubenswrapper[4755]: exit 1 Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637320 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637421 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.637484 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:31:47 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:31:47 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:31:47 crc kubenswrapper[4755]: set -o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:31:47 crc kubenswrapper[4755]: set +o allexport Mar 20 13:31:47 crc kubenswrapper[4755]: fi Mar 20 13:31:47 crc kubenswrapper[4755]: Mar 20 13:31:47 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:31:47 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:31:47 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:31:47 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:31:47 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:31:47 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:31:47 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.639098 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.648253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.661640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.677626 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.693594 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.701957 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.705880 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.717345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.730207 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.743091 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.756762 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.773547 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.789631 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.805148 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.812227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880202 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880542 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880632 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.880603309 +0000 UTC m=+88.478535878 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880711 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.880688932 +0000 UTC m=+88.478621641 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.880752 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.880933 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.881007 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.88098827 +0000 UTC m=+88.478920839 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908201 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.908215 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:47Z","lastTransitionTime":"2026-03-20T13:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.982198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:47 crc kubenswrapper[4755]: I0320 13:31:47.982307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982531 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982571 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982597 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982645 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 
13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982716 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982736 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982755 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.982722807 +0000 UTC m=+88.580655376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:47 crc kubenswrapper[4755]: E0320 13:31:47.982824 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:48.982792078 +0000 UTC m=+88.580724617 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011511 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011524 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011545 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.011560 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114355 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.114367 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216794 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216819 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.216829 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.224953 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.225043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319692 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.319757 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.422980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.423008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.423025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525842 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.525879 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.543939 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.544571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.559462 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.564261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.576263 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.582392 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.599794 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604845 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.604860 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.621379 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.627154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.641530 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.641708 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644490 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644528 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.644543 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747368 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747448 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.747459 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850797 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850879 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.850890 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.891928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892062 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892100 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892141 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.892096268 +0000 UTC m=+90.490028827 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892208 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.89218332 +0000 UTC m=+90.490115849 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.892236 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.892227671 +0000 UTC m=+90.490160200 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.953233 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:48Z","lastTransitionTime":"2026-03-20T13:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.993139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:48 crc kubenswrapper[4755]: I0320 13:31:48.993253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993495 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993557 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993594 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993552 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:48 crc 
kubenswrapper[4755]: E0320 13:31:48.993646 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993701 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993742 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.99371271 +0000 UTC m=+90.591645269 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:48 crc kubenswrapper[4755]: E0320 13:31:48.993797 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:50.993772982 +0000 UTC m=+90.591705511 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055890 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055972 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.055999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.056017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.158956 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.225253 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.225353 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:49 crc kubenswrapper[4755]: E0320 13:31:49.225583 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:49 crc kubenswrapper[4755]: E0320 13:31:49.225743 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.229792 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.230342 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.231765 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.232400 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.233430 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.233932 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.234564 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.235555 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.236295 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.237386 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.237961 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.239084 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.239578 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.240172 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.241089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.241604 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.242580 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.242992 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.243572 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.244592 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.245068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.246020 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.246460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.247461 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.247925 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.248576 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.249666 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.250129 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.251169 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.251606 4755 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.252493 4755 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.252613 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.254218 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.255090 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.255549 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.257102 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.257876 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 
13:31:49.258797 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.259448 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.260871 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.261380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.261985 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262107 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.262460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.263166 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.264125 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.264591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.265495 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.266087 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.267135 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.267627 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.268441 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.268965 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.269880 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.270442 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.271068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 
13:31:49.365207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.365222 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468844 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468870 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.468888 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572105 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572177 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572235 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.572260 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675748 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675780 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.675794 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779104 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779217 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.779295 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.882946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.882990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.883025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:49 crc kubenswrapper[4755]: I0320 13:31:49.986399 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:49Z","lastTransitionTime":"2026-03-20T13:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089517 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.089527 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192740 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192821 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.192835 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.224538 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.224685 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.295925 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.399561 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.502713 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608225 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.608269 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712413 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.712552 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815455 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.815471 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911049 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911128 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.911165 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911299 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911301 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911239359 +0000 UTC m=+94.509171888 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911298 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911364 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911347652 +0000 UTC m=+94.509280181 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: E0320 13:31:50.911384 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:54.911377623 +0000 UTC m=+94.509310152 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918084 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:50 crc kubenswrapper[4755]: I0320 13:31:50.918156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:50Z","lastTransitionTime":"2026-03-20T13:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.012412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.012469 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012542 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012568 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012569 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012581 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc 
kubenswrapper[4755]: E0320 13:31:51.012585 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012597 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012638 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:55.012622225 +0000 UTC m=+94.610554755 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.012669 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:31:55.012663697 +0000 UTC m=+94.610596226 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.021876 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124948 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124967 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.124979 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.225488 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.225506 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.225803 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:51 crc kubenswrapper[4755]: E0320 13:31:51.225941 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227392 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227461 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.227492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.237561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.249278 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.262888 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.276330 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.289441 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.301050 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.330989 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434735 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434801 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.434871 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538202 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538274 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.538286 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640626 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.640640 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.742968 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846581 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846600 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.846612 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:51 crc kubenswrapper[4755]: I0320 13:31:51.950207 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:51Z","lastTransitionTime":"2026-03-20T13:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.053419 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.156465 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.225355 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:52 crc kubenswrapper[4755]: E0320 13:31:52.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259240 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.259252 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.361585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.464829 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568061 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568165 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568193 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.568213 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671481 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.671524 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.775347 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878860 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878927 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.878984 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.981955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982025 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982046 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:52 crc kubenswrapper[4755]: I0320 13:31:52.982103 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:52Z","lastTransitionTime":"2026-03-20T13:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.085296 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.188627 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.225478 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.225561 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:31:53 crc kubenswrapper[4755]: E0320 13:31:53.225706 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:31:53 crc kubenswrapper[4755]: E0320 13:31:53.225833 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.291930 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.291986 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.292040 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.394958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395053 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395106 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.395126 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498111 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.498222 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601391 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.601431 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.704600 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808548 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808632 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808681 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.808728 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911833 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911937 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:53 crc kubenswrapper[4755]: I0320 13:31:53.911954 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:53Z","lastTransitionTime":"2026-03-20T13:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014550 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014621 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.014688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118278 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.118389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221287 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221322 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.221343 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.225574 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.225773 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.243905 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.244005 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.244268 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324307 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.324455 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428304 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428384 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.428446 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.532352 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634781 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.634796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.635143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.635188 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.653043 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.653199 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.740959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741018 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.741068 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844163 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844184 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.844197 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950789 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950808 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.950819 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:54Z","lastTransitionTime":"2026-03-20T13:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955052 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:31:54 crc kubenswrapper[4755]: I0320 13:31:54.955218 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955321 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955353 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955315 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955273882 +0000 UTC m=+102.553206411 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955467 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955420576 +0000 UTC m=+102.553353205 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:31:54 crc kubenswrapper[4755]: E0320 13:31:54.955514 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:02.955493638 +0000 UTC m=+102.553426397 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054027 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.054151 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.056561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.056637 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056877 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056954 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056881 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.056986 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc 
kubenswrapper[4755]: E0320 13:31:55.057017 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057042 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057099 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:03.057069239 +0000 UTC m=+102.655001948 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.057138 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:03.0571216 +0000 UTC m=+102.655054389 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158176 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.158200 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.225431 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.225433 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.225598 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:55 crc kubenswrapper[4755]: E0320 13:31:55.225700 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260596 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260630 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.260687 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363552 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.363570 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466903 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466916 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466938 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.466952 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.570179 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673649 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673708 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.673766 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.776244 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879680 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.879691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984612 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:55 crc kubenswrapper[4755]: I0320 13:31:55.984688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:55Z","lastTransitionTime":"2026-03-20T13:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087172 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.087263 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190726 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190753 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.190775 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.225220 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:56 crc kubenswrapper[4755]: E0320 13:31:56.225440 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292984 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.292993 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.293012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.293025 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.395105 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.497940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.497999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498013 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498036 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.498051 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.600989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.601006 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704168 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.704185 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.807772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911405 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:56 crc kubenswrapper[4755]: I0320 13:31:56.911530 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:56Z","lastTransitionTime":"2026-03-20T13:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.014398 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117540 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117563 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.117576 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220444 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220592 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.220621 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.224873 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.224887 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:57 crc kubenswrapper[4755]: E0320 13:31:57.225169 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:57 crc kubenswrapper[4755]: E0320 13:31:57.225487 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.324627 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428377 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428434 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.428492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531830 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.531848 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634689 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634756 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.634824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738064 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.738225 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842361 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842412 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842421 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.842451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:57 crc kubenswrapper[4755]: I0320 13:31:57.946585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:57Z","lastTransitionTime":"2026-03-20T13:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.050184 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.153995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154092 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.154107 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.224744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.224936 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257142 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257204 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.257266 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360676 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.360710 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464398 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464409 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464428 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.464441 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.567588 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.670982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671051 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671064 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.671101 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.774208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833584 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833709 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.833727 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.846705 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851858 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851961 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.851977 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.872939 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878611 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878715 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878790 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.878824 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.896465 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902874 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.902996 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.921806 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927301 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927334 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.927389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.943565 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:31:58 crc kubenswrapper[4755]: E0320 13:31:58.943787 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:58 crc kubenswrapper[4755]: I0320 13:31:58.946241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:58Z","lastTransitionTime":"2026-03-20T13:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048891 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048931 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.048945 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151675 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.151688 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.225727 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.225758 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:31:59 crc kubenswrapper[4755]: E0320 13:31:59.225947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:31:59 crc kubenswrapper[4755]: E0320 13:31:59.226000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253772 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253898 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.253920 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357578 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357700 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357722 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357749 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.357767 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462850 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462885 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.462909 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566310 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566407 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.566451 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669491 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.669544 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773219 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773284 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773302 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.773314 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.875965 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.876170 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.979926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980169 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980187 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980214 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:31:59 crc kubenswrapper[4755]: I0320 13:31:59.980234 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:31:59Z","lastTransitionTime":"2026-03-20T13:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.084397 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187551 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187571 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187599 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.187619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.224838 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:00 crc kubenswrapper[4755]: E0320 13:32:00.225021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290592 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290678 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290702 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.290714 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394313 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.394356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497769 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497800 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.497863 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.600994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704295 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704313 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.704365 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808439 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808453 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808476 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.808491 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912359 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912395 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:00 crc kubenswrapper[4755]: I0320 13:32:00.912419 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:00Z","lastTransitionTime":"2026-03-20T13:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014823 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014892 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014913 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.014927 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.117952 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118020 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118038 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.118083 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221822 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.221997 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.225295 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.225349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:01 crc kubenswrapper[4755]: E0320 13:32:01.225916 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:01 crc kubenswrapper[4755]: E0320 13:32:01.225458 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.246727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.262382 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.278447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\
\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.294491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.307562 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.320865 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324670 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.324739 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.337848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428399 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428422 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.428437 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532017 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532085 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532104 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.532151 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.635682 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739360 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.739456 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.842954 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843109 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.843133 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:01 crc kubenswrapper[4755]: I0320 13:32:01.946220 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:01Z","lastTransitionTime":"2026-03-20T13:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.048969 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049044 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049058 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.049091 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.152998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153080 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.153155 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.225456 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.226000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228100 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:32:02 crc kubenswrapper[4755]: else Mar 20 13:32:02 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:32:02 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:32:02 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228290 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.228618 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:02 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:32:02 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:32:02 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:32:02 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:32:02 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:32:02 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:32:02 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:32:02 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:32:02 crc kubenswrapper[4755]: --disable-approver \ Mar 
20 13:32:02 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:32:02 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:02 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},
Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.229788 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.229840 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.230822 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:02 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:02 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:02 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:02 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:02 crc kubenswrapper[4755]: fi Mar 20 13:32:02 crc kubenswrapper[4755]: Mar 20 
13:32:02 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 20 13:32:02 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:02 crc kubenswrapper[4755]: --disable-webhook \ Mar 20 13:32:02 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 20 13:32:02 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:02 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:02 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:02 crc kubenswrapper[4755]: E0320 13:32:02.232048 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.257945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258047 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258079 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.258105 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361776 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361803 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.361821 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464896 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.464942 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568075 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568139 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568156 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.568202 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672403 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672452 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.672487 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.776181 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879114 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879191 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879239 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.879261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.982940 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983019 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:02 crc kubenswrapper[4755]: I0320 13:32:02.983088 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:02Z","lastTransitionTime":"2026-03-20T13:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.039946 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.040092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.040180 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040260 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.040224509 +0000 UTC m=+118.638157048 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040305 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040313 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040394 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.040372133 +0000 UTC m=+118.638304692 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.040429 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:32:19.040407954 +0000 UTC m=+118.638340523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086569 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086593 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.086642 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.141560 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.141751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.141953 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142000 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142018 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142037 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142082 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142040 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142160 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.14213409 +0000 UTC m=+118.740066649 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.142305 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:19.142264583 +0000 UTC m=+118.740197162 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.193348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194222 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194289 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.194312 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.224918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.225038 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.225122 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:03 crc kubenswrapper[4755]: E0320 13:32:03.225260 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297792 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297943 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.297964 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.401539 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504489 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.504638 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.607999 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608069 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.608136 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.711295 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.814848 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815024 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815049 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815083 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.815104 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918300 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918390 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:03 crc kubenswrapper[4755]: I0320 13:32:03.918404 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:03Z","lastTransitionTime":"2026-03-20T13:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021283 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.021416 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125196 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125264 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125281 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.125356 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.225078 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:32:04 crc kubenswrapper[4755]: E0320 13:32:04.225331 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227824 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227854 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.227869 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330597 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330714 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.330810 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434206 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.434255 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537646 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.537691 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641426 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.641529 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.744402 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848108 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848178 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.848242 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:04 crc kubenswrapper[4755]: I0320 13:32:04.951858 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:04Z","lastTransitionTime":"2026-03-20T13:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054828 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.054846 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158588 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.158618 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.225599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.225778 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:32:05 crc kubenswrapper[4755]: E0320 13:32:05.225985 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.226078 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e"
Mar 20 13:32:05 crc kubenswrapper[4755]: E0320 13:32:05.226184 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262367 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.262478 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364869 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364924 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.364969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468149 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468473 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468485 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.468515 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.575840 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576767 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576793 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.576845 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680512 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680619 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680716 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.680751 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.685619 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.688162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4"}
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.688634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.708810 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.723077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.739499 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.762742 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, 
/tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.780472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783883 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.783992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.784011 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.798324 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.813011 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887579 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.887642 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991102 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:05 crc kubenswrapper[4755]: I0320 13:32:05.991186 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:05Z","lastTransitionTime":"2026-03-20T13:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.007154 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zf67p"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.007864 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.010513 4755 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.010569 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.010607 4755 reflector.go:561] object-"openshift-dns"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.010733 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace 
\"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.013265 4755 reflector.go:561] object-"openshift-dns"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.013329 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-dns\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.030273 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.040292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.053997 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.067052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.093994 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094086 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094138 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.094156 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.114253 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.139789 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.159476 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.171441 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.171472 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.172362 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197186 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.197230 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.225258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.225455 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272430 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.272579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/378696d3-72aa-4101-9746-a2b0d203f525-hosts-file\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299370 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299423 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299469 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.299490 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.399185 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-dmzsb"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.400300 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402301 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402605 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-8btvn"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402872 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402936 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.402981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403003 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403075 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xmn6s"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403324 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.403570 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.405118 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.405252 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.406067 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.406283 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410141 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410693 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410791 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.410976 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412504 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.412502 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.423258 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.439578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.444827 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.455831 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.465792 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.479215 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.491345 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506363 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506385 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.506399 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.507421 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.523465 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.538607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.552317 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.565030 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577083 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577142 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577822 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577862 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc 
kubenswrapper[4755]: I0320 13:32:06.577910 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.577936 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578102 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rknk9\" (UniqueName: \"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578327 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578449 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578606 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578724 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578785 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.578945 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579063 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579270 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579434 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579543 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579607 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579700 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.579764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.581333 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.597257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609796 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609911 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.609932 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.614062 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.627708 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.642891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.665785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680792 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rknk9\" (UniqueName: 
\"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.680967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681100 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" 
(UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-k8s-cni-cncf-io\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681202 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681221 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-cnibin\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681395 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-etc-kubernetes\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681442 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/3eb406f6-1a26-4eea-84ac-e55f5232900c-rootfs\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " 
pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681455 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " 
pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681863 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681905 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682020 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-binary-copy\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-conf-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682389 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-netns\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-bin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.681175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-system-cni-dir\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-os-release\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682552 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aa13631f-58da-4411-8e94-2385741a977e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: 
\"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682637 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-kubelet\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/3eb406f6-1a26-4eea-84ac-e55f5232900c-mcd-auth-proxy-config\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682839 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-os-release\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682906 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-var-lib-cni-multus\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.682986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-system-cni-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-cnibin\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-cni-binary-copy\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-hostroot\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-host-run-multus-certs\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-daemon-config\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683929 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/aa13631f-58da-4411-8e94-2385741a977e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.683957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e5ba4f17-8c41-4124-b563-01d5f1751139-multus-socket-dir-parent\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.686694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.693097 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3eb406f6-1a26-4eea-84ac-e55f5232900c-proxy-tls\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.702053 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.709684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9w8l\" (UniqueName: \"kubernetes.io/projected/e5ba4f17-8c41-4124-b563-01d5f1751139-kube-api-access-p9w8l\") pod \"multus-8btvn\" (UID: \"e5ba4f17-8c41-4124-b563-01d5f1751139\") " pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715102 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715445 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.715535 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.717313 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lctxm\" (UniqueName: \"kubernetes.io/projected/aa13631f-58da-4411-8e94-2385741a977e-kube-api-access-lctxm\") pod \"multus-additional-cni-plugins-dmzsb\" (UID: \"aa13631f-58da-4411-8e94-2385741a977e\") " pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.718139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rknk9\" (UniqueName: \"kubernetes.io/projected/3eb406f6-1a26-4eea-84ac-e55f5232900c-kube-api-access-rknk9\") pod \"machine-config-daemon-xmn6s\" (UID: \"3eb406f6-1a26-4eea-84ac-e55f5232900c\") " pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.727339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.739206 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-8btvn" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.742725 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa13631f_58da_4411_8e94_2385741a977e.slice/crio-bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f WatchSource:0}: Error finding container bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f: Status 404 returned error can't find the container with id bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.746791 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lctxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Li
venessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dmzsb_openshift-multus(aa13631f-58da-4411-8e94-2385741a977e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.748075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podUID="aa13631f-58da-4411-8e94-2385741a977e" Mar 20 13:32:06 crc kubenswrapper[4755]: W0320 13:32:06.750018 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ba4f17_8c41_4124_b563_01d5f1751139.slice/crio-3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398 WatchSource:0}: Error finding container 3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398: Status 404 returned error can't find the container with id 3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398 Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.750298 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.753808 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:06 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 13:32:06 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 13:32:06 crc kubenswrapper[4755]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9w8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:06 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.755000 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.771339 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.774771 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:06 crc kubenswrapper[4755]: E0320 13:32:06.776072 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.792706 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.793919 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.799035 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800674 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800722 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800798 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.800959 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.802104 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.804338 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.813861 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818766 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818847 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.818865 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.829580 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.844238 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.860886 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.877047 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884468 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884569 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884714 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884801 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: 
I0320 13:32:06.884854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884916 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884938 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.884960 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.885022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.885045 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.891909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.906778 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921761 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.921783 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:06Z","lastTransitionTime":"2026-03-20T13:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.924273 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.942559 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.960334 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.984313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986549 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986708 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc 
kubenswrapper[4755]: I0320 13:32:06.986862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986916 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.986946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987006 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987073 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987196 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987313 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987371 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987924 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.987996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"ovnkube-node-bd25w\" (UID: 
\"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988165 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988209 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.988731 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.989004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.989082 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.993119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:06 crc kubenswrapper[4755]: I0320 13:32:06.995892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.006119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"ovnkube-node-bd25w\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025511 4755 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.025597 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.118732 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129063 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129123 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.129169 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.137364 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 13:32:07 crc kubenswrapper[4755]: apiVersion: v1 Mar 20 13:32:07 crc kubenswrapper[4755]: clusters: Mar 20 13:32:07 crc kubenswrapper[4755]: - cluster: Mar 20 13:32:07 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 13:32:07 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: contexts: Mar 20 13:32:07 crc kubenswrapper[4755]: - context: Mar 20 13:32:07 crc kubenswrapper[4755]: cluster: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: namespace: default Mar 20 13:32:07 crc kubenswrapper[4755]: user: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: current-context: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: kind: Config Mar 20 13:32:07 crc kubenswrapper[4755]: preferences: {} Mar 20 13:32:07 crc kubenswrapper[4755]: users: Mar 20 13:32:07 crc kubenswrapper[4755]: - name: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: user: Mar 20 13:32:07 crc kubenswrapper[4755]: client-certificate: 
/etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: EOF Mar 20 13:32:07 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlq8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.139489 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.225743 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.226074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.226303 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.226549 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231537 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.231552 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.244350 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.287249 4755 projected.go:288] Couldn't get configMap openshift-dns/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.326236 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334595 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334731 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334759 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.334778 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.425600 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.425600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.428037 4755 projected.go:194] Error preparing data for projected volume kube-api-access-h5qsm for pod openshift-dns/node-resolver-zf67p: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.428170 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm podName:378696d3-72aa-4101-9746-a2b0d203f525 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:07.928137109 +0000 UTC m=+107.526069668 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-h5qsm" (UniqueName: "kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm") pod "node-resolver-zf67p" (UID: "378696d3-72aa-4101-9746-a2b0d203f525") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438070 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.438101 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540915 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.540997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.541048 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.541067 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644190 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644199 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644215 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.644229 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.701962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"bc1578c1551e3d2beb3ccd0909ef120d34d51a4fb1339fa4c8fa2132312f289f"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.704153 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lctxm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fal
lbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-dmzsb_openshift-multus(aa13631f-58da-4411-8e94-2385741a977e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.704465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"c5dfe5e0ba9e4e073084c039346a869cdace2560fac63c02b23de7cad0ed5e4a"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.706326 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podUID="aa13631f-58da-4411-8e94-2385741a977e" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.706366 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 20 13:32:07 crc kubenswrapper[4755]: apiVersion: v1 Mar 20 13:32:07 crc kubenswrapper[4755]: clusters: Mar 20 13:32:07 crc kubenswrapper[4755]: - cluster: Mar 20 13:32:07 crc kubenswrapper[4755]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 20 13:32:07 crc kubenswrapper[4755]: server: https://api-int.crc.testing:6443 Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: contexts: Mar 20 13:32:07 crc kubenswrapper[4755]: - context: Mar 
20 13:32:07 crc kubenswrapper[4755]: cluster: default-cluster Mar 20 13:32:07 crc kubenswrapper[4755]: namespace: default Mar 20 13:32:07 crc kubenswrapper[4755]: user: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: name: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: current-context: default-context Mar 20 13:32:07 crc kubenswrapper[4755]: kind: Config Mar 20 13:32:07 crc kubenswrapper[4755]: preferences: {} Mar 20 13:32:07 crc kubenswrapper[4755]: users: Mar 20 13:32:07 crc kubenswrapper[4755]: - name: default-auth Mar 20 13:32:07 crc kubenswrapper[4755]: user: Mar 20 13:32:07 crc kubenswrapper[4755]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 20 13:32:07 crc kubenswrapper[4755]: EOF Mar 20 13:32:07 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlq8b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc 
kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.707101 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"a7d0785ead0f22a1ed3bc834bf588ea6bac1d78135d33174225637d2a0f4afac"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.707677 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.708564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"3db492a10eb5fd8f5b722cc6766af01ea865ae90bcbad81d2c14eae4adfb0398"} Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.709090 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.711944 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rknk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.712124 4755 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:07 crc kubenswrapper[4755]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 20 13:32:07 crc kubenswrapper[4755]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 20 13:32:07 crc kubenswrapper[4755]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p9w8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:07 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.714191 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:32:07 crc kubenswrapper[4755]: E0320 13:32:07.714243 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.716365 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.736537 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746778 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746811 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746820 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746836 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.746847 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.747741 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.757909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.768360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.778406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.790960 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.817301 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.833454 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.847460 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849582 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.849598 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.863017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.875746 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.891810 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.908023 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.921957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.935807 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953396 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.953513 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:07Z","lastTransitionTime":"2026-03-20T13:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.955314 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.973954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.986550 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.997204 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:07 crc kubenswrapper[4755]: I0320 13:32:07.999055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.004793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5qsm\" (UniqueName: \"kubernetes.io/projected/378696d3-72aa-4101-9746-a2b0d203f525-kube-api-access-h5qsm\") pod \"node-resolver-zf67p\" (UID: \"378696d3-72aa-4101-9746-a2b0d203f525\") " pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 
13:32:08.011188 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.024787 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.040297 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.056929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057259 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.057625 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.085764 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"
resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68
77441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.100432 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.128739 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zf67p" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.156560 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:08 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:08 crc kubenswrapper[4755]: set -uo pipefail Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 13:32:08 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 20 13:32:08 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< 
"${SERVICES}" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 20 13:32:08 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." Mar 20 13:32:08 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: while true; do Mar 20 13:32:08 crc kubenswrapper[4755]: declare -A svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 13:32:08 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 13:32:08 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 13:32:08 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 13:32:08 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 13:32:08 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 20 13:32:08 crc kubenswrapper[4755]: do Mar 20 13:32:08 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 20 13:32:08 crc kubenswrapper[4755]: break Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 20 13:32:08 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 13:32:08 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 13:32:08 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Append resolver entries for services Mar 20 13:32:08 crc kubenswrapper[4755]: rc=0 Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 20 13:32:08 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 13:32:08 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 20 13:32:08 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 13:32:08 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: unset svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5qsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zf67p_openshift-dns(378696d3-72aa-4101-9746-a2b0d203f525): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:08 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.158127 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zf67p" podUID="378696d3-72aa-4101-9746-a2b0d203f525" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.159990 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160042 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc 
kubenswrapper[4755]: I0320 13:32:08.160062 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.160124 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.224978 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.225198 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.263960 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366744 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.366767 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471130 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471205 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471223 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471256 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.471281 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574736 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574814 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574837 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.574850 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678611 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678762 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.678821 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.714390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf67p" event={"ID":"378696d3-72aa-4101-9746-a2b0d203f525","Type":"ContainerStarted","Data":"a6a1240d78c23b729861be2ba99389d3cedd2920dff26b255d3ba2e8d5584aba"} Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.717038 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:08 crc kubenswrapper[4755]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:08 crc kubenswrapper[4755]: set -uo pipefail Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 13:32:08 crc kubenswrapper[4755]: HOSTS_FILE="/etc/hosts" Mar 20 13:32:08 crc kubenswrapper[4755]: TEMP_FILE="/etc/hosts.tmp" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Make a temporary file with the old hosts file's attributes. Mar 20 13:32:08 crc kubenswrapper[4755]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: echo "Failed to preserve hosts file. Exiting." 
Mar 20 13:32:08 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: while true; do Mar 20 13:32:08 crc kubenswrapper[4755]: declare -A svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${services[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 13:32:08 crc kubenswrapper[4755]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 13:32:08 crc kubenswrapper[4755]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 13:32:08 crc kubenswrapper[4755]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 13:32:08 crc kubenswrapper[4755]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:32:08 crc kubenswrapper[4755]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 13:32:08 crc kubenswrapper[4755]: for i in ${!cmds[*]} Mar 20 13:32:08 crc kubenswrapper[4755]: do Mar 20 13:32:08 crc kubenswrapper[4755]: ips=($(eval "${cmds[i]}")) Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: svc_ips["${svc}"]="${ips[@]}" Mar 20 13:32:08 crc kubenswrapper[4755]: break Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Update /etc/hosts only if we get valid service IPs Mar 20 13:32:08 crc kubenswrapper[4755]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 13:32:08 crc kubenswrapper[4755]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 13:32:08 crc kubenswrapper[4755]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 13:32:08 crc kubenswrapper[4755]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # Append resolver entries for services Mar 20 13:32:08 crc kubenswrapper[4755]: rc=0 Mar 20 13:32:08 crc kubenswrapper[4755]: for svc in "${!svc_ips[@]}"; do Mar 20 13:32:08 crc kubenswrapper[4755]: for ip in ${svc_ips[${svc}]}; do Mar 20 13:32:08 crc kubenswrapper[4755]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: if [[ $rc -ne 0 ]]; then Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: continue Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: Mar 20 13:32:08 crc kubenswrapper[4755]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 13:32:08 crc kubenswrapper[4755]: # Replace /etc/hosts with our modified version if needed Mar 20 13:32:08 crc kubenswrapper[4755]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 13:32:08 crc kubenswrapper[4755]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 13:32:08 crc kubenswrapper[4755]: fi Mar 20 13:32:08 crc kubenswrapper[4755]: sleep 60 & wait Mar 20 13:32:08 crc kubenswrapper[4755]: unset svc_ips Mar 20 13:32:08 crc kubenswrapper[4755]: done Mar 20 13:32:08 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h5qsm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-zf67p_openshift-dns(378696d3-72aa-4101-9746-a2b0d203f525): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:08 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:08 crc kubenswrapper[4755]: E0320 13:32:08.718451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-zf67p" podUID="378696d3-72aa-4101-9746-a2b0d203f525" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.729089 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.753269 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.764417 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.773640 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782065 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 
13:32:08.782358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782446 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.782525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.785542 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.796957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.810491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.825829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.842945 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.856910 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.874614 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885386 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.885492 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.892444 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.903259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988829 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988907 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988958 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:08 crc kubenswrapper[4755]: I0320 13:32:08.988980 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:08Z","lastTransitionTime":"2026-03-20T13:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025375 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025459 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025482 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.025496 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.037698 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042255 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.042385 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.056522 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061730 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061774 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.061825 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.078811 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083629 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083641 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.083694 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.099291 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104354 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104705 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104865 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.104994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.119179 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.119611 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121920 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121970 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.121989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.122015 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.122035 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.224552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.224617 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.224730 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:09 crc kubenswrapper[4755]: E0320 13:32:09.224904 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225458 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225525 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225538 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.225579 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.328362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.328867 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.329341 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.431640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.431991 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432078 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432164 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.432253 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.536241 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639120 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639185 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639207 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639238 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.639259 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742291 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.742935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.743090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.743230 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846810 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846887 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846914 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.846972 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950269 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:09 crc kubenswrapper[4755]: I0320 13:32:09.950406 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:09Z","lastTransitionTime":"2026-03-20T13:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.053684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054261 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054503 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.054585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.158529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159161 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.159525 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.224911 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:10 crc kubenswrapper[4755]: E0320 13:32:10.225126 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262835 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262900 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.262970 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365838 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.365942 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.366016 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469902 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469926 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.469982 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574352 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574382 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.574402 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678339 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678437 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.678511 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782110 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.782864 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.783380 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.783609 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.887862 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888544 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.888976 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992554 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992687 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992813 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:10 crc kubenswrapper[4755]: I0320 13:32:10.992883 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:10Z","lastTransitionTime":"2026-03-20T13:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.095909 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.095982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096000 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096028 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.096048 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199839 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.199862 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.225572 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.225881 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:11 crc kubenswrapper[4755]: E0320 13:32:11.226021 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:11 crc kubenswrapper[4755]: E0320 13:32:11.225851 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.246013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.274129 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.293067 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303606 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303633 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.303704 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.312274 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.327306 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.343578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.353925 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.380214 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.392073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407040 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407133 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407170 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407211 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.407236 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.409540 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.423327 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.435795 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.449234 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510468 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510488 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510513 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.510533 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613021 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613089 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613107 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613134 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.613154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717188 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717266 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717288 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.717347 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820590 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820669 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820695 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.820708 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923773 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:11 crc kubenswrapper[4755]: I0320 13:32:11.923812 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:11Z","lastTransitionTime":"2026-03-20T13:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027504 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027586 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.027619 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131463 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.131565 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.225257 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:12 crc kubenswrapper[4755]: E0320 13:32:12.225510 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234336 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.234382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337122 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337194 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337241 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.337261 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441097 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441216 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.441234 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544406 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544424 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544457 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.544484 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647523 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647531 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647546 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.647556 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.686921 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-b9bt2"] Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.687490 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691890 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.692062 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.691906 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.724588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\
\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}}
,{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.739786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751226 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751247 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.751332 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.758262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.771202 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.781914 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.797864 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.812389 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.828947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.849394 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852451 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852552 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4m4w\" (UniqueName: 
\"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.852725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855479 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855526 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.855548 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.880991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.902064 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.922317 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.938744 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.952848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4m4w\" (UniqueName: \"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.953700 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 
13:32:12.953944 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/741d8a76-423b-4e13-aedb-fff0e87a207c-host\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.957836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/741d8a76-423b-4e13-aedb-fff0e87a207c-serviceca\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959539 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959712 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.959741 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:12Z","lastTransitionTime":"2026-03-20T13:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:12 crc kubenswrapper[4755]: I0320 13:32:12.985187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4m4w\" (UniqueName: \"kubernetes.io/projected/741d8a76-423b-4e13-aedb-fff0e87a207c-kube-api-access-w4m4w\") pod \"node-ca-b9bt2\" (UID: \"741d8a76-423b-4e13-aedb-fff0e87a207c\") " pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.010761 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-b9bt2" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.032932 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:13 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 13:32:13 crc kubenswrapper[4755]: while [ true ]; Mar 20 13:32:13 crc kubenswrapper[4755]: do Mar 20 13:32:13 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $f Mar 20 13:32:13 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: else Mar 20 13:32:13 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 20 13:32:13 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 20 
13:32:13 crc kubenswrapper[4755]: echo $d Mar 20 13:32:13 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4m4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
node-ca-b9bt2_openshift-image-registry(741d8a76-423b-4e13-aedb-fff0e87a207c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:13 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.034629 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-b9bt2" podUID="741d8a76-423b-4e13-aedb-fff0e87a207c" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.062951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063010 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063045 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.063112 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166197 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166290 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166319 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.166338 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.224917 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.225049 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.225144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.225267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269785 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.269948 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373143 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373311 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.373331 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476564 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476711 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.476758 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579480 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579558 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579576 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579605 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.579622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682514 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682541 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.682559 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.733953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9bt2" event={"ID":"741d8a76-423b-4e13-aedb-fff0e87a207c","Type":"ContainerStarted","Data":"8b589ffb4c4049b586dc6be51e4e22a1e3a85b7d86d156bc885232694332d18a"} Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.737309 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:13 crc kubenswrapper[4755]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 13:32:13 crc kubenswrapper[4755]: while [ true ]; Mar 20 13:32:13 crc kubenswrapper[4755]: do Mar 20 13:32:13 crc kubenswrapper[4755]: for f in $(ls /tmp/serviceca); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $f Mar 20 13:32:13 crc kubenswrapper[4755]: ca_file_path="/tmp/serviceca/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ -e "${reg_dir_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: else Mar 20 13:32:13 crc kubenswrapper[4755]: mkdir $reg_dir_path Mar 20 13:32:13 crc kubenswrapper[4755]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: for d in $(ls /etc/docker/certs.d); do Mar 20 13:32:13 crc kubenswrapper[4755]: echo $d Mar 20 13:32:13 crc kubenswrapper[4755]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 13:32:13 crc kubenswrapper[4755]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 13:32:13 crc kubenswrapper[4755]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 20 13:32:13 crc kubenswrapper[4755]: rm -rf /etc/docker/certs.d/$d Mar 20 13:32:13 crc kubenswrapper[4755]: fi Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: sleep 60 & wait ${!} Mar 20 13:32:13 crc kubenswrapper[4755]: done Mar 20 13:32:13 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w4m4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-b9bt2_openshift-image-registry(741d8a76-423b-4e13-aedb-fff0e87a207c): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:13 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:13 crc kubenswrapper[4755]: E0320 13:32:13.738863 4755 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-b9bt2" podUID="741d8a76-423b-4e13-aedb-fff0e87a207c" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.745472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.756217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.773644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.785877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787598 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787642 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.787673 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.798965 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.818707 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.837004 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.852911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.872218 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.885004 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.890983 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891234 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.891510 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.899220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.918801 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.938999 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.953367 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.996982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997136 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997330 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:13 crc kubenswrapper[4755]: I0320 13:32:13.997434 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:13Z","lastTransitionTime":"2026-03-20T13:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100276 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.100382 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203575 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203585 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.203622 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.224874 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:14 crc kubenswrapper[4755]: E0320 13:32:14.225043 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307286 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307431 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.307443 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.410980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.411002 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514754 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514788 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.514810 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618703 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618720 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618743 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.618757 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722560 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722617 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722684 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.722702 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826385 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826449 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826466 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.826524 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929944 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929958 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929981 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:14 crc kubenswrapper[4755]: I0320 13:32:14.929993 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:14Z","lastTransitionTime":"2026-03-20T13:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033297 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033348 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033369 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.033381 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.135997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136060 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136076 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136100 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.136117 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.225515 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.225625 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:15 crc kubenswrapper[4755]: E0320 13:32:15.225747 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:15 crc kubenswrapper[4755]: E0320 13:32:15.225835 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239721 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.239745 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342851 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342919 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.342996 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447477 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447561 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447614 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.447630 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551131 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551209 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551228 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.551275 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654876 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654893 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654923 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.654944 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759381 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.759400 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862353 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862486 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.862585 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965436 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965505 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965547 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:15 crc kubenswrapper[4755]: I0320 13:32:15.965564 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:15Z","lastTransitionTime":"2026-03-20T13:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069227 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069305 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069331 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069366 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.069389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174023 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174101 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174150 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.174170 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.224762 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.225114 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.227793 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:16 crc kubenswrapper[4755]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:32:16 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:16 crc kubenswrapper[4755]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:32:16 crc kubenswrapper[4755]: source /etc/kubernetes/apiserver-url.env Mar 20 13:32:16 crc kubenswrapper[4755]: else Mar 20 13:32:16 crc kubenswrapper[4755]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:32:16 crc kubenswrapper[4755]: exit 1 Mar 20 13:32:16 crc kubenswrapper[4755]: fi Mar 20 13:32:16 crc kubenswrapper[4755]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:32:16 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:16 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:16 crc kubenswrapper[4755]: E0320 13:32:16.229491 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 
13:32:16.277308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277318 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.277346 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381945 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381982 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.381994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485328 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485346 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485373 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.485389 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588324 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588432 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588462 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.588481 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691467 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.691514 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795859 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795921 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795956 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.795969 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898725 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898745 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898775 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:16 crc kubenswrapper[4755]: I0320 13:32:16.898796 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:16Z","lastTransitionTime":"2026-03-20T13:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002934 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.002977 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105807 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105871 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105905 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.105924 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208886 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.208976 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.209004 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.209020 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.225349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.225374 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.225550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.226022 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.227256 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.227771 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:32:17 crc kubenswrapper[4755]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 20 13:32:17 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then Mar 20 13:32:17 crc kubenswrapper[4755]: set -o allexport Mar 20 13:32:17 crc kubenswrapper[4755]: source "/env/_master" Mar 20 13:32:17 crc kubenswrapper[4755]: set +o allexport Mar 20 13:32:17 crc 
kubenswrapper[4755]: fi Mar 20 13:32:17 crc kubenswrapper[4755]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 20 13:32:17 crc kubenswrapper[4755]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 20 13:32:17 crc kubenswrapper[4755]: ho_enable="--enable-hybrid-overlay" Mar 20 13:32:17 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 20 13:32:17 crc kubenswrapper[4755]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 20 13:32:17 crc kubenswrapper[4755]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 20 13:32:17 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-host=127.0.0.1 \ Mar 20 13:32:17 crc kubenswrapper[4755]: --webhook-port=9743 \ Mar 20 13:32:17 crc kubenswrapper[4755]: ${ho_enable} \ Mar 20 13:32:17 crc kubenswrapper[4755]: --enable-interconnect \ Mar 20 13:32:17 crc kubenswrapper[4755]: --disable-approver \ Mar 20 13:32:17 crc kubenswrapper[4755]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --wait-for-kubernetes-api=200s \ Mar 20 13:32:17 crc kubenswrapper[4755]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 20 13:32:17 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}" Mar 20 13:32:17 crc kubenswrapper[4755]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:17 crc 
kubenswrapper[4755]: > logger="UnhandledError"
Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.228367 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49"
Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.229803 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:32:17 crc kubenswrapper[4755]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 13:32:17 crc kubenswrapper[4755]: if [[ -f "/env/_master" ]]; then
Mar 20 13:32:17 crc kubenswrapper[4755]: set -o allexport
Mar 20 13:32:17 crc kubenswrapper[4755]: source "/env/_master"
Mar 20 13:32:17 crc kubenswrapper[4755]: set +o allexport
Mar 20 13:32:17 crc kubenswrapper[4755]: fi
Mar 20 13:32:17 crc kubenswrapper[4755]: 
Mar 20 13:32:17 crc kubenswrapper[4755]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 13:32:17 crc kubenswrapper[4755]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 13:32:17 crc kubenswrapper[4755]: --disable-webhook \
Mar 20 13:32:17 crc kubenswrapper[4755]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 13:32:17 crc kubenswrapper[4755]: --loglevel="${LOGLEVEL}"
Mar 20 13:32:17 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:32:17 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:32:17 crc kubenswrapper[4755]: E0320 13:32:17.231051 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312387 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312745 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.312955 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.415941 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416338 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416694 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.416858 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520154 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520237 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520251 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520273 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.520289 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.623701 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624440 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624693 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.624890 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728358 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728506 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728567 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.728593 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831889 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831953 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.831971 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.832006 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.832028 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936484 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936496 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936518 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:17 crc kubenswrapper[4755]: I0320 13:32:17.936530 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:17Z","lastTransitionTime":"2026-03-20T13:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040267 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040899 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.040980 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.041001 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145141 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.145989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.146154 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.224907 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:32:18 crc kubenswrapper[4755]: E0320 13:32:18.225135 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249374 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249433 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.249498 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352306 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352389 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352414 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.352431 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456056 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456140 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456159 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456189 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.456208 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.548104 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn"] Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.548869 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.551508 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.553315 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558957 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558973 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.558997 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.559017 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.568272 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.583302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.597885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.616386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.619999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620192 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.620248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.627891 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.640379 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.651855 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Compl
eted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.661947 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662022 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662037 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662066 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.662080 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.666132 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.682560 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.696405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.713941 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721816 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721890 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.721981 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.723011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.723329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc 
kubenswrapper[4755]: I0320 13:32:18.731527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/18256fa3-a343-4dc3-8c00-f6f5de000b4b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.742181 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcct2\" (UniqueName: \"kubernetes.io/projected/18256fa3-a343-4dc3-8c00-f6f5de000b4b-kube-api-access-dcct2\") pod \"ovnkube-control-plane-749d76644c-4mcbn\" (UID: \"18256fa3-a343-4dc3-8c00-f6f5de000b4b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.743252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.753041 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.758510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.772089 4755 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774236 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774337 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774363 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.774379 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.785452 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.798848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.813607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.826610 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.840010 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.853439 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.862370 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.865456 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.874588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878157 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878210 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878249 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878275 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.878292 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.897884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.912502 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.930211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.941112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.951919 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.971186 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981329 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981478 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981519 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981535 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981555 4755 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:18 crc kubenswrapper[4755]: I0320 13:32:18.981565 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:18Z","lastTransitionTime":"2026-03-20T13:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.007366 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.018526 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084706 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084746 4755 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084755 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084771 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.084781 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125714 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125799 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.125836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.125980 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126037 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.126020508 +0000 UTC m=+150.723953057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126107 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.12609583 +0000 UTC m=+150.724028359 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126196 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.126290 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.126269245 +0000 UTC m=+150.724201774 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188741 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188831 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188849 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188880 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.188900 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.225074 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.225127 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.225307 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226152 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.226308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.226394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226495 4755 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226535 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226556 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226595 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226624 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226634 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.226607032 +0000 UTC m=+150.824539731 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226645 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.226790 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.226766027 +0000 UTC m=+150.824698596 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.279555 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"] Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.282093 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.282170 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295280 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295321 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295332 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295349 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.295362 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.307336 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.319235 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.328014 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.328265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.336979 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.351191 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.363407 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.380696 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.389762 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398031 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398072 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398088 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398112 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.398126 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.402991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.422755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.430555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.430642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.430742 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.430813 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:32:19.930790993 +0000 UTC m=+119.528723522 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.441214 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450216 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24qzh\" (UniqueName: \"kubernetes.io/projected/37d1e037-c169-4932-9928-f3d23ff47c07-kube-api-access-24qzh\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450622 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450698 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450713 4755 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.450755 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.455130 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.464693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.466070 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472502 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472515 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.472547 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.481203 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.481510 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485863 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485906 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485929 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485949 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.485959 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.495850 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500263 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500325 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500345 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.500361 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.507246 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.516368 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.518011 4755 kubelet_node_status.go:585] "Error updating node status, 
will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522877 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522908 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522918 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522935 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.522948 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.531960 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.534027 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.534182 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544316 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544527 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544623 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544739 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.544812 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648224 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648254 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648272 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.648284 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751125 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751171 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751183 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751203 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.751217 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753558 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.753607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" event={"ID":"18256fa3-a343-4dc3-8c00-f6f5de000b4b","Type":"ContainerStarted","Data":"9197c20e8553f6422e715ce60f4db4d61ef6e1e55385abe17b9eb996f3107c3b"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.756580 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.756981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.757900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" 
event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.772265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.792499 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.803048 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.818442 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.832037 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.844698 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854150 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854928 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.854989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855068 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.855088 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.864054 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.880741 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.890700 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.907360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.918460 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.931446 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.938261 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.938469 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: E0320 13:32:19.938548 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:20.938530219 +0000 UTC m=+120.536462768 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.943383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.957571 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:19Z","lastTransitionTime":"2026-03-20T13:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.958277 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.972758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.983483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:19 crc kubenswrapper[4755]: I0320 13:32:19.994735 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.007645 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.020395 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.030780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.039295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.046748 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.057041 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.060917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.060992 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061012 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061493 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.061556 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.070775 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.083107 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.098852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.111930 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.125564 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.134211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.150767 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.159625 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164308 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164351 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164362 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164379 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.164390 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.225432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.225624 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267011 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267057 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267067 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267087 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.267098 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371246 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371320 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371378 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.371401 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474542 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474932 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474951 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474975 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.474994 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578556 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578582 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.578602 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683783 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683856 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683881 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683910 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.683934 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.764771 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca" exitCode=0 Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.764886 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.777758 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787719 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787733 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787758 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.787772 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.790239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\
\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.805512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.815081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.824952 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.840842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.862000 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.874938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.886005 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891573 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891604 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891616 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891634 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.891646 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.905916 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.917873 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.930139 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.946687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.950704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.950908 4755 
secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:20 crc kubenswrapper[4755]: E0320 13:32:20.951005 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:22.950979824 +0000 UTC m=+122.548912433 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.958383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.982419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:20 crc kubenswrapper[4755]: I0320 13:32:20.996576 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999329 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999383 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999418 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:20.999442 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:20Z","lastTransitionTime":"2026-03-20T13:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103179 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103233 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103242 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103260 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.103270 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:21Z","lastTransitionTime":"2026-03-20T13:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.203778 4755 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225108 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.225202 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225350 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225722 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.225959 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.239535 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.249483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.279915 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.293432 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.305357 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.321423 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: E0320 13:32:21.332845 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.339985 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.351158 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.370873 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.388371 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.403049 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.417571 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.432545 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.462586 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.478525 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.497021 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.772349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.779011 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130" exitCode=0 Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.779083 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130"} Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.798529 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.823958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.857016 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.871786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.889756 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.906963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.923527 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.933309 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.964084 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.973947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:21 crc kubenswrapper[4755]: I0320 13:32:21.987165 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.004291 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.012107 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.021080 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.030484 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.039122 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.047581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.068415 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.081912 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.094474 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.106588 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.120548 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.131932 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.142579 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.151454 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.164999 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\"
,\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.178725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.194113 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.233241 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.233797 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.234406 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.255733 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.292752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.325933 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.358856 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.786877 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c" exitCode=0 Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.786961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c"} Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.791564 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" exitCode=0 Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.791705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.823834 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.841882 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.858486 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.871303 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.883173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.896323 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.910081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.924780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.936954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.950834 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.969860 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.973407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.973626 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:22 crc kubenswrapper[4755]: E0320 13:32:22.973768 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:26.973743215 +0000 UTC m=+126.571675834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.987170 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:22 crc kubenswrapper[4755]: I0320 13:32:22.995504 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.007784 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.021295 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.033242 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.043456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.083687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.119048 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.157684 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.198341 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.224947 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.225041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.224966 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225112 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225211 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:23 crc kubenswrapper[4755]: E0320 13:32:23.225322 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.240972 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\
\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.279920 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.315761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.360740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.400564 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.443532 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.477287 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.519365 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.560470 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.598017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.643837 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.680447 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.718115 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.802876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zf67p" event={"ID":"378696d3-72aa-4101-9746-a2b0d203f525","Type":"ContainerStarted","Data":"4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" 
event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808348 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808391 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.808406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.812875 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869" 
exitCode=0 Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.812947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869"} Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.823752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.839510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.859136 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.878153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.926306 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:23 crc kubenswrapper[4755]: I0320 13:32:23.961949 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.003900 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.038959 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.082768 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.118770 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.157512 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.208403 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.225308 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:24 crc kubenswrapper[4755]: E0320 13:32:24.225452 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.238914 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.282915 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.319315 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.363520 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.401980 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.442944 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.481211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.526286 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.576821 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.600250 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.643242 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.682838 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.717945 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.775486 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.801799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.824225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122"} Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.842410 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.878813 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.922596 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:24 crc kubenswrapper[4755]: I0320 13:32:24.960182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.003252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.045173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.081678 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.117152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.177152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.201950 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225244 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225340 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225448 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225589 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.225712 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:25 crc kubenswrapper[4755]: E0320 13:32:25.225955 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.238121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.277777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.317418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.360554 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.404984 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.441568 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.482989 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.530557 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"term
inated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"
name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067461
6e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.573757 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.603223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.642894 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.685514 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.720384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.760607 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.838787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.843023 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122" exitCode=0 Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.843084 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122"} Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.868225 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",
\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129
aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.878847 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.890800 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-co
py\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.918377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.959163 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:25 crc kubenswrapper[4755]: I0320 13:32:25.997508 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.038937 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.078938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.119315 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.159366 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.211817 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.225040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:26 crc kubenswrapper[4755]: E0320 13:32:26.225306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.239517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.284821 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.319239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: E0320 13:32:26.334290 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.360983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.397970 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.446187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.856335 4755 generic.go:334] "Generic (PLEG): container finished" podID="aa13631f-58da-4411-8e94-2385741a977e" containerID="a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3" exitCode=0 Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.856519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerDied","Data":"a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3"} Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.860142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-b9bt2" event={"ID":"741d8a76-423b-4e13-aedb-fff0e87a207c","Type":"ContainerStarted","Data":"3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a"} Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.872344 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.891295 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.908958 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.924693 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.943285 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.964622 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a
930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:26 crc kubenswrapper[4755]: I0320 13:32:26.989479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.001404 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\
\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.020520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.020786 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.020939 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 
nodeName:}" failed. No retries permitted until 2026-03-20 13:32:35.020907794 +0000 UTC m=+134.618840353 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.034319 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7
e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.056599 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.068216 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.080136 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.095262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.109413 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.121510 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.133752 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.141593 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.162953 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.199885 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225080 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225231 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225398 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.225443 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225633 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:27 crc kubenswrapper[4755]: E0320 13:32:27.225732 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.241557 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.280597 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.318616 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.359603 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.401052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.442162 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.477928 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.521971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.564550 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.611062 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.639259 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.683870 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.723465 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.762531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.802851 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.872387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" event={"ID":"aa13631f-58da-4411-8e94-2385741a977e","Type":"ContainerStarted","Data":"7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf"} Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.888475 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.918954 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.934517 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:27 crc kubenswrapper[4755]: I0320 13:32:27.961137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.003966 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.045052 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.079425 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.122307 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.162531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.208750 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.225258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:28 crc kubenswrapper[4755]: E0320 13:32:28.225434 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.246211 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.281778 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.320790 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.370850 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.399361 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.444784 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.484158 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.886568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95"} Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.887152 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.887226 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.910494 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.925280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.940280 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.955701 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.968098 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:28 crc kubenswrapper[4755]: I0320 13:32:28.988257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.003572 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.022311 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.039306 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.068623 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:3
0:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\
\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-cop
y\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.078948 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.091383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.099611 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.109274 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.121619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.135946 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.147841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.162913 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.196498 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.225360 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.225726 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.226019 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.226225 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.226372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.226525 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.240881 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-d
ir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshi
ft-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.282342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.319496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.360382 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.398516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.447384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.477354 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.530106 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.560679 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564451 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564492 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564532 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564553 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.564564 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.579422 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585074 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585116 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585128 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585148 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.585161 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.599142 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.599491 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604672 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604713 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.604724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.605059 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.605099 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.619273 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624265 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624312 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624326 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.624357 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.641842 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656471 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656536 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656559 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.656617 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:29Z","lastTransitionTime":"2026-03-20T13:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.669234 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad1619268906079
28dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.680830 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: E0320 13:32:29.681080 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.698285 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.717797 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.760558 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.798449 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.836355 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.893113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1"} Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.894901 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.925176 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.948302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.959201 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.970353 4755 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba995
4761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:29 crc kubenswrapper[4755]: I0320 13:32:29.998709 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.041069 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.078930 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.121372 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.159711 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"n
ame\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.216931 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc
2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9
bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.225790 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:30 crc kubenswrapper[4755]: E0320 13:32:30.226550 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.237483 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\
\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.278502 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.320151 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c
08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.363774 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert
-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd7
91fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.403149 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.441637 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.480601 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.521176 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.899479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81"} Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.899577 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb"} Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.925221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.944208 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc 
kubenswrapper[4755]: I0320 13:32:30.966377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:30 crc kubenswrapper[4755]: I0320 13:32:30.988353 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.005675 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.024604 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.048365 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.063453 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.091604 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.105159 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.120358 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.134982 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.148702 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.162590 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.175140 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.190848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.207469 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.224982 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.224993 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.225032 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.225648 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.225812 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.226055 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.245781 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mo
untPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.286738 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.323578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: E0320 13:32:31.334933 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.361728 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.401077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.439976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.479890 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.538340 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.566950 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.607793 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.640318 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.680120 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.721386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.773173 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.802397 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc 
kubenswrapper[4755]: I0320 13:32:31.845470 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.879595 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.926938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.966831 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:31 crc kubenswrapper[4755]: I0320 13:32:31.997892 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc 
kubenswrapper[4755]: I0320 13:32:32.042963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.080600 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.128142 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.160594 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.199124 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.224750 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:32 crc kubenswrapper[4755]: E0320 13:32:32.224891 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.245962 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-2
0T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.283312 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.322464 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.361953 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.401241 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.441864 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.483699 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.520871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.564984 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.911162 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.959366 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" exitCode=1 Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.959457 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95"} Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.960493 4755 scope.go:117] "RemoveContainer" 
containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" Mar 20 13:32:32 crc kubenswrapper[4755]: I0320 13:32:32.995479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sh
a256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.009610 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.021247 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.034227 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.051633 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.071077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event 
handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\
":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317
fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.083921 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.099519 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.132243 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.148152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.167262 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c2
5781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.184570 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.200740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.221467 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.224772 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.224910 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.224972 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.225167 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.225399 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:33 crc kubenswrapper[4755]: E0320 13:32:33.225574 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.241935 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"
os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.258851 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.276179 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.967819 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.971513 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" 
event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91"} Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.972184 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:33 crc kubenswrapper[4755]: I0320 13:32:33.994951 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:33Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.014487 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.031602 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.047513 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.062018 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.079777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.099786 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.113480 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.127102 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.141848 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"
2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.154265 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.168828 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.184435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.211991 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.226043 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:34 crc kubenswrapper[4755]: E0320 13:32:34.226238 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.247785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 
2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.259802 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc 
kubenswrapper[4755]: I0320 13:32:34.270956 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:34Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.979853 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.981974 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/0.log" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987218 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" exitCode=1 Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987291 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91"} Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.987361 4755 scope.go:117] "RemoveContainer" containerID="52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95" Mar 20 13:32:34 crc kubenswrapper[4755]: I0320 13:32:34.988599 4755 scope.go:117] "RemoveContainer" 
containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:34 crc kubenswrapper[4755]: E0320 13:32:34.988978 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.011137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.021948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.022213 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.022314 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs 
podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:32:51.022287488 +0000 UTC m=+150.620220057 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.031215 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bd
eaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.066400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\
"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019
bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646f
b68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\
\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.086137 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.106083 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.128471 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.146217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.161008 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.177540 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.199761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.225357 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.225481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.225557 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.225745 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.226332 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.226494 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRe
adOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: E0320 13:32:35.227121 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.249850 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.269218 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.296121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.333577 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq
8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.353871 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc 
kubenswrapper[4755]: I0320 13:32:35.379209 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:35Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.994505 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595"} Mar 20 13:32:35 crc kubenswrapper[4755]: I0320 13:32:35.997600 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.003416 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.003671 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.032976 4755 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b63
1bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"e
tcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe64265
8386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\
"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.050006 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9ccc
d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for 
pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.064343 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.080379 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.100025 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.119858 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.138937 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.160852 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.179724 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.216905 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52e0063807b8bac7791c936d1b27de29bacbe502ca53a3a641cfc77c32c71c95\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:32Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:32:32.541238 6713 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:32.541269 6713 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:32.541316 6713 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13:32:32.541362 6713 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 13:32:32.541543 6713 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:32.541593 6713 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:32.541613 6713 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 13:32:32.541722 6713 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:32.541739 6713 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 13:32:32.541767 6713 factory.go:656] Stopping watch factory\\\\nI0320 13:32:32.541784 6713 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:32.541819 6713 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:32.541830 6713 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:32.541838 6713 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 13:32:32.541844 6713 handler.go:208] Removed *v1.Node event handler 2\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\"
:\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq
8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.224640 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.224836 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.235066 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc 
kubenswrapper[4755]: I0320 13:32:36.261014 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.283620 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.306799 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.327058 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: E0320 13:32:36.337916 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.346573 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c33
0ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.366887 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f64
45c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.386479 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.418740 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.433034 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.447733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.463719 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.476971 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.490386 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.519206 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\
\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"
2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.545013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd890
9e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.562079 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.576530 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.597469 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.625213 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.640682 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.656346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.692644 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:36 crc kubenswrapper[4755]: I0320 13:32:36.707407 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:36Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224684 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224738 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.224931 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:37 crc kubenswrapper[4755]: I0320 13:32:37.224973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.225149 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:37 crc kubenswrapper[4755]: E0320 13:32:37.225305 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:38 crc kubenswrapper[4755]: I0320 13:32:38.224894 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:38 crc kubenswrapper[4755]: E0320 13:32:38.225154 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225055 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.225895 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225184 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.225171 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.225993 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.226281 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689096 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689181 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689198 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689226 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.689243 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.716957 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723262 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723343 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723371 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723404 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.723433 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.745721 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752570 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752690 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.752706 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.777099 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783127 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783200 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783221 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.783268 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.803360 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809030 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809090 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:39 crc kubenswrapper[4755]: I0320 13:32:39.809163 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:39Z","lastTransitionTime":"2026-03-20T13:32:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.832009 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:39 crc kubenswrapper[4755]: E0320 13:32:39.832253 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:40 crc kubenswrapper[4755]: I0320 13:32:40.225580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:40 crc kubenswrapper[4755]: E0320 13:32:40.225875 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225740 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.225948 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226121 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226348 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.226583 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.248435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/service
account\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.277829 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b
7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e
779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899a
fd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.293911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.308293 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.325286 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: E0320 13:32:41.338375 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.343496 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/do
cker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.363884 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.378019 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.395587 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.411966 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.426426 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.445281 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.471296 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.488773 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.505039 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.526061 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:41 crc kubenswrapper[4755]: I0320 13:32:41.540223 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:42 crc kubenswrapper[4755]: I0320 13:32:42.224631 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:42 crc kubenswrapper[4755]: E0320 13:32:42.225254 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.224882 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.225110 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.225460 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.225568 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:43 crc kubenswrapper[4755]: I0320 13:32:43.226278 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:43 crc kubenswrapper[4755]: E0320 13:32:43.226484 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:44 crc kubenswrapper[4755]: I0320 13:32:44.224555 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:44 crc kubenswrapper[4755]: E0320 13:32:44.224777 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225286 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225383 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225472 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:45 crc kubenswrapper[4755]: I0320 13:32:45.225458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225576 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:45 crc kubenswrapper[4755]: E0320 13:32:45.225804 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:46 crc kubenswrapper[4755]: I0320 13:32:46.224864 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:46 crc kubenswrapper[4755]: E0320 13:32:46.225030 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:46 crc kubenswrapper[4755]: E0320 13:32:46.340887 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225285 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225373 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225439 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225553 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:47 crc kubenswrapper[4755]: I0320 13:32:47.225637 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:47 crc kubenswrapper[4755]: E0320 13:32:47.225756 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:48 crc kubenswrapper[4755]: I0320 13:32:48.224980 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:48 crc kubenswrapper[4755]: E0320 13:32:48.225162 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225908 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.225843 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226125 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226240 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:49 crc kubenswrapper[4755]: E0320 13:32:49.226385 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:49 crc kubenswrapper[4755]: I0320 13:32:49.227877 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.059557 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.063167 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.063754 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.082116 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.099681 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.114525 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.129256 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.143112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.155422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162925 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162966 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162979 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.162998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.163010 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.168330 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.175951 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.179866 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.179917 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180115 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180135 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.180147 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.182983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.192296 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195784 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195799 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195827 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.195854 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.198318 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.208836 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.210491 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212609 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212674 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212718 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc 
kubenswrapper[4755]: I0320 13:32:50.212779 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.212793 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.225184 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.225208 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.225478 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.228887 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233162 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233218 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233232 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233252 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.233264 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:32:50Z","lastTransitionTime":"2026-03-20T13:32:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.243734 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.249485 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: E0320 13:32:50.249609 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.262686 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 
13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.279112 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc 
kubenswrapper[4755]: I0320 13:32:50.291998 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.316768 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"na
me\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:50 crc kubenswrapper[4755]: I0320 13:32:50.331405 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.051238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.051433 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.051994 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:23.051963823 +0000 UTC m=+182.649896382 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.070634 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.071573 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/1.log" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075767 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" exitCode=1 Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556"} Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.075950 4755 scope.go:117] "RemoveContainer" containerID="008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 
13:32:51.076828 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.077283 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.100475 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.143712 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.152394 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.152476 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152714 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.152674815 +0000 UTC m=+214.750607334 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152792 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.152936 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.152899231 +0000 UTC m=+214.750831800 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.153534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.153761 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.153975 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.153953122 +0000 UTC m=+214.751885661 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.165691 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.185821 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f6470558
18ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.209281 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.225802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227337 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.226455 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227584 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.225825 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.227761 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.228639 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.
d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.252269 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.254747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 
13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.254798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255003 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255030 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255045 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255103 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.255087755 +0000 UTC m=+214.853020294 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255120 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255153 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255172 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.255236 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:33:55.255218429 +0000 UTC m=+214.853150978 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.269313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.287383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.303439 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.324153 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: E0320 13:32:51.342191 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.344458 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c33
0ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.371098 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.385518 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.404037 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.420338 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.432404 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.445687 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.468797 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.483255 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.503364 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.521861 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.540313 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.555127 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.575947 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.592342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.608419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.624164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.639561 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.654435 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.672346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://008046974bed0976b38fd2acd866adb2a639127a5b02cac815ffb52cad079c91\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:34Z\\\",\\\"message\\\":\\\"20 13:32:34.337669 6895 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 13:32:34.337718 6895 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 13:32:34.337795 6895 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 13:32:34.337878 6895 handler.go:190] Sending *v1.Pod 
event handler 3 for removal\\\\nI0320 13:32:34.337890 6895 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 13:32:34.337938 6895 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 13:32:34.337969 6895 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 13:32:34.337983 6895 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 13:32:34.337996 6895 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 13:32:34.338007 6895 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 13:32:34.338019 6895 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 13:32:34.340241 6895 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 13:32:34.340278 6895 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 13:32:34.340354 6895 factory.go:656] Stopping watch factory\\\\nI0320 13:32:34.340383 6895 ovnkube.go:599] Stopped ovnkube\\\\nI0320 13:32:34.340376 6895 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 13\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\"
:\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/
env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.687641 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc 
kubenswrapper[4755]: I0320 13:32:51.709880 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:51 crc kubenswrapper[4755]: I0320 13:32:51.724694 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.083452 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.089201 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:32:52 crc kubenswrapper[4755]: E0320 13:32:52.089462 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s 
restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.105400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.140745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.161172 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.178472 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.191816 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.209091 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.224324 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.224829 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:52 crc kubenswrapper[4755]: E0320 13:32:52.225004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.239168 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.252733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.273578 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.293012 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.326431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.344909 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.366808 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.382100 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.399456 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:52 crc kubenswrapper[4755]: I0320 13:32:52.416494 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225000 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225024 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:53 crc kubenswrapper[4755]: I0320 13:32:53.225205 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.225578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.226138 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:53 crc kubenswrapper[4755]: E0320 13:32:53.226235 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:54 crc kubenswrapper[4755]: I0320 13:32:54.225685 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:54 crc kubenswrapper[4755]: E0320 13:32:54.225894 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.225421 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225497 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.225802 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:55 crc kubenswrapper[4755]: I0320 13:32:55.225226 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:55 crc kubenswrapper[4755]: E0320 13:32:55.226274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:56 crc kubenswrapper[4755]: I0320 13:32:56.225013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:56 crc kubenswrapper[4755]: E0320 13:32:56.225167 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:56 crc kubenswrapper[4755]: E0320 13:32:56.343618 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.225374 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.226193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:57 crc kubenswrapper[4755]: I0320 13:32:57.225410 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226428 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226708 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:57 crc kubenswrapper[4755]: E0320 13:32:57.226868 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:58 crc kubenswrapper[4755]: I0320 13:32:58.225640 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:32:58 crc kubenswrapper[4755]: E0320 13:32:58.225900 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224723 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224861 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.224943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.224913 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.225301 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:32:59 crc kubenswrapper[4755]: E0320 13:32:59.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:32:59 crc kubenswrapper[4755]: I0320 13:32:59.241792 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.224943 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.225179 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611627 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611696 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611710 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611728 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.611741 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.631801 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.636963 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637195 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637357 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637533 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.637712 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.655826 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660946 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.660989 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.661007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.661019 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.680846 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686724 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686791 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686843 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.686859 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.705271 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.710998 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711465 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711640 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.711978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:00 crc kubenswrapper[4755]: I0320 13:33:00.712177 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:00Z","lastTransitionTime":"2026-03-20T13:33:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.733811 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:00 crc kubenswrapper[4755]: E0320 13:33:00.734086 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.224877 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.224891 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.225150 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.225336 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.225900 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.226186 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.241963 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771a
ee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating 
authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.264335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.286983 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.303477 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.329842 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: E0320 13:33:01.345520 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.365320 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.383704 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.424745 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.438881 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.466017 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.482911 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.497302 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.513509 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.530074 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.545123 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.564073 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.579731 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:01 crc kubenswrapper[4755]: I0320 13:33:01.596800 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:01Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.225544 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:02 crc kubenswrapper[4755]: E0320 13:33:02.226084 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.227351 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:02 crc kubenswrapper[4755]: E0320 13:33:02.227644 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:02 crc kubenswrapper[4755]: I0320 13:33:02.244192 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225021 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225127 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:03 crc kubenswrapper[4755]: I0320 13:33:03.225187 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226014 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:03 crc kubenswrapper[4755]: E0320 13:33:03.226853 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:04 crc kubenswrapper[4755]: I0320 13:33:04.225145 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:04 crc kubenswrapper[4755]: E0320 13:33:04.225356 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225603 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:05 crc kubenswrapper[4755]: I0320 13:33:05.225869 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.225888 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.226033 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:05 crc kubenswrapper[4755]: E0320 13:33:05.226160 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:06 crc kubenswrapper[4755]: I0320 13:33:06.225429 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:06 crc kubenswrapper[4755]: E0320 13:33:06.225689 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:06 crc kubenswrapper[4755]: E0320 13:33:06.346904 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227498 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227608 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228209 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:07 crc kubenswrapper[4755]: I0320 13:33:07.227805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228446 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:07 crc kubenswrapper[4755]: E0320 13:33:07.228615 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.225213 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:08 crc kubenswrapper[4755]: E0320 13:33:08.225927 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446683 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446945 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" exitCode=1 Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.446985 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545"} Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.447456 4755 scope.go:117] "RemoveContainer" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.464709 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.479164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.491429 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.517221 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.531241 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.543335 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.560181 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.572418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.587182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.600212 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.611081 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.621342 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.633944 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.647321 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f
3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.657835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.669381 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.682809 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.699976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:08 crc kubenswrapper[4755]: I0320 13:33:08.709078 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:08Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.225621 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.225973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.225939 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.226133 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.226139 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:09 crc kubenswrapper[4755]: E0320 13:33:09.226239 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.457078 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.457202 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc"} Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.480252 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.498560 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.515785 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.539340 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.562813 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.581780 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.599795 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.624798 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\"
,\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.650849 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.669355 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.690024 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.710440 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.742732 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.760755 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.780408 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.797690 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.812715 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.847749 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:09 crc kubenswrapper[4755]: I0320 13:33:09.865217 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.225255 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.225465 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805679 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.805899 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.827277 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833447 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833510 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833529 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833557 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.833577 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.856380 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861245 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861314 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861344 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861372 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.861392 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.883677 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889342 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889397 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889411 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889429 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.889441 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.902345 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.906955 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.906995 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907008 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907026 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:10 crc kubenswrapper[4755]: I0320 13:33:10.907040 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:10Z","lastTransitionTime":"2026-03-20T13:33:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.920522 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:10 crc kubenswrapper[4755]: E0320 13:33:10.920674 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.225533 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.226218 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226485 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.226584 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.226938 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.252360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resou
rces\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.278736 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.299936 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.324513 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: E0320 13:33:11.347579 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.352427 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files 
in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\
":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.392075 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.417164 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.440533 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.458935 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f
3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.476171 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.495326 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.514222 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.540790 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.557619 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.576175 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\
\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.599220 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.613802 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.636418 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:11 crc kubenswrapper[4755]: I0320 13:33:11.650400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:12 crc kubenswrapper[4755]: I0320 13:33:12.225472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:12 crc kubenswrapper[4755]: E0320 13:33:12.226804 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225711 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225711 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:13 crc kubenswrapper[4755]: I0320 13:33:13.225895 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.225986 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.226209 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:13 crc kubenswrapper[4755]: E0320 13:33:13.226381 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:14 crc kubenswrapper[4755]: I0320 13:33:14.225293 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:14 crc kubenswrapper[4755]: E0320 13:33:14.225579 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225253 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.225578 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.225775 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.225976 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.226109 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:15 crc kubenswrapper[4755]: E0320 13:33:15.226170 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.487023 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.492800 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.498252 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.519849 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.545384 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.562055 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.583552 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.599478 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.617377 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.632613 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.650005 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.673581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.686359 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc 
kubenswrapper[4755]: I0320 13:33:15.700152 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e6
9a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.722013 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.743175 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.757257 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.782179 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.796411 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.824078 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.846187 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:15 crc kubenswrapper[4755]: I0320 13:33:15.882727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:15Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.225407 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.225682 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.349846 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.500286 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.501799 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/2.log" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506467 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" exitCode=1 Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.506694 4755 scope.go:117] "RemoveContainer" containerID="f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.507998 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:16 crc kubenswrapper[4755]: E0320 13:33:16.508333 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.548292 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\
\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.569235 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.583121 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.595969 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.607727 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.622927 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.638522 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.660539 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.680737 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.706008 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.737058 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f6fc1a354d62821bbe52af0ec32b1be6969895d031c7400eeb25851ffa53e556\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:32:50Z\\\",\\\"message\\\":\\\" (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0320 13:32:50.379301 7085 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: failed to add event handler: handler {0x1e60340 0x1e60020 0x1e5ffc0} was not added to shared informer because it has stopped already, failed 
to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:32:50Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:32:50.379299 7085 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-oauth-apiserver/api_TCP_cluster\\\\\\\", UUID:\\\\\\\"fe46cb89-4e54-4175-a112-1c5224cd299e\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-oauth-apiserver/api\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e 
UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3
d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.750870 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.775422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is 
after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.795071 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab
8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.815209 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.836353 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.857069 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.871581 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:16 crc kubenswrapper[4755]: I0320 13:33:16.891271 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:16Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.225513 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.225773 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.225924 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.226740 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.226863 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.227105 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.514544 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.520311 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:17 crc kubenswrapper[4755]: E0320 13:33:17.520578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.535777 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.554628 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.567352 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.598552 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.618957 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.637511 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.658835 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.680345 4755 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to 
/host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-cert
s\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.696970 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.715101 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.735482 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40
6eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.754128 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.774312 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.794725 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.824186 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.857565 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.877516 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.904071 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:17 crc kubenswrapper[4755]: I0320 13:33:17.923360 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:17Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:18 crc kubenswrapper[4755]: I0320 13:33:18.225204 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:18 crc kubenswrapper[4755]: E0320 13:33:18.225446 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225523 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225552 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.225815 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:19 crc kubenswrapper[4755]: I0320 13:33:19.225832 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.226045 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:19 crc kubenswrapper[4755]: E0320 13:33:19.225896 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:20 crc kubenswrapper[4755]: I0320 13:33:20.225198 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:20 crc kubenswrapper[4755]: E0320 13:33:20.225822 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085335 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085419 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085443 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085474 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.085495 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.114215 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120494 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120568 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120586 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120613 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.120631 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.143370 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149402 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149454 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149470 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149495 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.149513 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.169177 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174620 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174643 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174697 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.174717 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.196028 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201212 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201333 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201408 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201442 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.201467 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:21Z","lastTransitionTime":"2026-03-20T13:33:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.225217 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.225313 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.225687 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.226165 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226333 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.226996 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.227267 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.259519 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.278743 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.299427 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.316425 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.335181 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: E0320 13:33:21.350616 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.356169 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720
243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.375622 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.395534 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.416967 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.444897 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.478877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.494938 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.516412 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.540473 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.572077 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.591877 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.613847 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.632858 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:21 crc kubenswrapper[4755]: I0320 13:33:21.651049 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:22 crc kubenswrapper[4755]: I0320 13:33:22.225717 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:22 crc kubenswrapper[4755]: E0320 13:33:22.225925 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.147179 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.147409 4755 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.148058 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs podName:37d1e037-c169-4932-9928-f3d23ff47c07 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:27.148028335 +0000 UTC m=+246.745960894 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs") pod "network-metrics-daemon-kpm42" (UID: "37d1e037-c169-4932-9928-f3d23ff47c07") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.225723 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.225816 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.226291 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.226473 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:23 crc kubenswrapper[4755]: I0320 13:33:23.227015 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:23 crc kubenswrapper[4755]: E0320 13:33:23.227202 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:24 crc kubenswrapper[4755]: I0320 13:33:24.225206 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:24 crc kubenswrapper[4755]: E0320 13:33:24.225413 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225564 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225736 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.225866 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:25 crc kubenswrapper[4755]: I0320 13:33:25.225920 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.226176 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:25 crc kubenswrapper[4755]: E0320 13:33:25.226390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:26 crc kubenswrapper[4755]: I0320 13:33:26.225132 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:26 crc kubenswrapper[4755]: E0320 13:33:26.225345 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:26 crc kubenswrapper[4755]: E0320 13:33:26.352143 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.225868 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.226025 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:27 crc kubenswrapper[4755]: I0320 13:33:27.226270 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226458 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226619 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:27 crc kubenswrapper[4755]: E0320 13:33:27.226822 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:28 crc kubenswrapper[4755]: I0320 13:33:28.224962 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:28 crc kubenswrapper[4755]: E0320 13:33:28.225325 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.225182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.225412 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.225806 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.225945 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:29 crc kubenswrapper[4755]: I0320 13:33:29.226090 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:29 crc kubenswrapper[4755]: E0320 13:33:29.226407 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:30 crc kubenswrapper[4755]: I0320 13:33:30.225297 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:30 crc kubenswrapper[4755]: E0320 13:33:30.225518 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225147 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225548 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225620 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.225853 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.225960 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.253841 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.272614 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.292364 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.324492 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.350647 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.353459 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.373733 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.392109 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.409120 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d
01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.428761 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba9525
3029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.451611 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.472400 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.489531 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.507840 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517282 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517639 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517825 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.517962 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.518060 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.527001 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.531493 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536415 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536499 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536521 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536589 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.536607 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.551565 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.556176 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metric
s-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-contro
ller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] 
Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557549 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557594 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557608 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557628 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.557641 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.572321 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc 
kubenswrapper[4755]: E0320 13:33:31.573550 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579460 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579734 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579804 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579878 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.579943 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.594391 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:22Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Comple
ted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.597953 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602033 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602081 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602093 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602113 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.602123 4755 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:31Z","lastTransitionTime":"2026-03-20T13:33:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.608878 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.615354 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:31 crc kubenswrapper[4755]: E0320 13:33:31.615521 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:31 crc kubenswrapper[4755]: I0320 13:33:31.623044 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:32 crc kubenswrapper[4755]: I0320 13:33:32.224901 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:32 crc kubenswrapper[4755]: E0320 13:33:32.226115 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225709 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.225554 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.225865 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.226002 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.226590 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:33 crc kubenswrapper[4755]: I0320 13:33:33.227184 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:33 crc kubenswrapper[4755]: E0320 13:33:33.227452 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:34 crc kubenswrapper[4755]: I0320 13:33:34.234561 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:34 crc kubenswrapper[4755]: E0320 13:33:34.234787 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225051 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225136 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225289 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:35 crc kubenswrapper[4755]: I0320 13:33:35.225169 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225432 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:35 crc kubenswrapper[4755]: E0320 13:33:35.225772 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:36 crc kubenswrapper[4755]: I0320 13:33:36.225131 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:36 crc kubenswrapper[4755]: E0320 13:33:36.225377 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:36 crc kubenswrapper[4755]: E0320 13:33:36.356375 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225439 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225518 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:37 crc kubenswrapper[4755]: I0320 13:33:37.225597 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.225777 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.226052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:37 crc kubenswrapper[4755]: E0320 13:33:37.226419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:38 crc kubenswrapper[4755]: I0320 13:33:38.225699 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:38 crc kubenswrapper[4755]: E0320 13:33:38.226309 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225603 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225775 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:39 crc kubenswrapper[4755]: I0320 13:33:39.225851 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226042 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226117 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:39 crc kubenswrapper[4755]: E0320 13:33:39.226190 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:40 crc kubenswrapper[4755]: I0320 13:33:40.225412 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:40 crc kubenswrapper[4755]: E0320 13:33:40.225752 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225722 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.225997 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.225988 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.226189 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.226946 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.259922 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0de398a-6f32-4b1c-a840-10ff45da7251\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:16Z\\\",\\\"message\\\":\\\"ions:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.219:80:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7594bb65-e742-44b3-a975-d639b1128be5}] Until: 
Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.349102 7381 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-machine-api/machine-api-controllers]} name:Service_openshift-machine-api/machine-api-controllers_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.167:8441: 10.217.4.167:8442: 10.217.4.167:8444:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {62af83f3-e0c8-4632-aaaa-17488566a9d8}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:33:16.348706 7381 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 13:33:16.349357 7381 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:33:15Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5c50c59aafc4f27a0
1fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jlq8b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bd25w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.277987 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-kpm42" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37d1e037-c169-4932-9928-f3d23ff47c07\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-24qzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:19Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-kpm42\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.299788 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c5a2944-296d-48ba-915d-640503b92beb\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:22Z\\\",\\\"message\\\":\\\"W0320 13:31:21.524052 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 13:31:21.525165 1 crypto.go:601] Generating new CA for check-endpoints-signer@1774013481 cert, and key in /tmp/serving-cert-1633750751/serving-signer.crt, /tmp/serving-cert-1633750751/serving-signer.key\\\\nI0320 13:31:21.861337 1 observer_polling.go:159] Starting file observer\\\\nW0320 13:31:21.866235 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:21Z is after 2026-02-23T05:33:16Z\\\\nI0320 13:31:21.866492 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:31:21.867493 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1633750751/tls.crt::/tmp/serving-cert-1633750751/tls.key\\\\\\\"\\\\nF0320 13:31:22.243391 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T13:31:22Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:31:21Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:3
0:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.318976 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5a67e7d816ef6b4e220f8f84d21c09688dd893c38acd983fdf77e5f4b65d41f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a
1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.341346 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.357749 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.365419 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.385393 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa13631f-58da-4411-8e94-2385741a977e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7d3d54b179a35cf86200cd288aca8a39ea845c332b122756c3e877becc1967bf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://91330cd9a8812e6eaccdac0ffe777bddf5576b99a9103020e0fe2add54811aca\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42345c330ba85dcf9f3fe1df39cbd8aba5fcf70db093b3265c6b8c773cb2130\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://169f49865aae0f218b414830c82f04156092a930126bcf2867c8da789e93cd1c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://06768
038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://06768038b4570b9971119b0d57a4d2772e9bd9b06c46e191c70f0b3c3b4d4869\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://63132e5b2522949bf8994a23e95934a00650091d635a37f6565939154bd3c122\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:25Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a7ce6c10f35d3dd9fed3fcccaae69d6d59ce05a4d1fcd1065f00a5affcebd6d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lctxm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-dmzsb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.408608 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74e794f3-35a8-44bb-b160-caa2a9cd93c9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c5eeb395c97b60f808f1a7ebe356103a9b864be53b40ea945b51da2384acc8a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ae68252c52d1a57db174f378550940bf377525854f49de87d9140dae8f13b8a8\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:31:18Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:30:49.212757 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:30:49.214715 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:30:49.216720 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:30:49.217873 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:31:18.748305 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:31:18.748409 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:31:17Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:49Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:31:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1aa485693c1915273a9c4c67738bbef8d94ab3126817f429f3d178aaf604b314\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ce68d335061abebd652ab60ff0dbaa6762d3b33aa001f8d054906e82fc9185\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.425498 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"677d46ff-b6a3-4f31-b051-9aee1edfb21f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a8f3a6ad6c5b56f09ff7ce36a945f28bd87355c700970847a78cddb50433dc6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50e7905db814ea2191da90c45c9b2784cb605b44469a456d1f0bcaea559e8e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.439860 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-zf67p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"378696d3-72aa-4101-9746-a2b0d203f525\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f626f401039ad87430d684d2223f4473a06e96f17c2e82d2de5971e50e2df61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h5qsm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-zf67p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.461757 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5afc1dd9-e489-4403-9fb9-a9b35f988887\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a79e0052169e98c156a01e0bdafb6700ad041dbf9914f6bb0a4f52e3216dd09a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b9a9b521ee70949d2c89f9cc0a3e6f3b631bd0d27c1a931ef744f1035f43b02\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5eea42f2f2c1a41d33be15801f7d8b112787004549c6bb0d20ae479935d22e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a30ba9738ea434da61666bf7967630d2ce898c6a343e1e8e86216619989da5c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://94cbf7c8dd3eef8361c4451c8c6e2bc34a693cc06e938c7810eb4b062ed89251\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ddfebe642658386a94613e6a0d7e5055a4572a9f8c129aa6545fc0006a6e4ae6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8a81734eb07068c651338a5e8af0c43de081ac684b791b18af19fb3758d42c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f0f664dbc3cf39a77899afd974ec9bde1f615a4ad5e370ae837df140eed5368c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.477969 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3eb406f6-1a26-4eea-84ac-e55f5232900c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fac45bc89404b6526bff9c8cd7582483933d617f0c20173871a4e9c3be9cccd7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2
a7a57c2ffc6c11a609eca280\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rknk9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xmn6s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.492383 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-b9bt2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"741d8a76-423b-4e13-aedb-fff0e87a207c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fb54c7f2bed02b7d01ebbbff9fde3da944a62cd19e6926daf3cfa4185ad039a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w4m4w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-b9bt2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.514239 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"18256fa3-a343-4dc3-8c00-f6f5de000b4b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fb67433a2af5331e1c55d30bc3cd96ca38562c75807d7449466a3c5677b746e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf6a53789eff04dc3c8b54e47d60217ba95253029f601cd44cce24af2de75318\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dcct2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:18Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-4mcbn\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.530182 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"84ffc52f-627e-40be-983b-76214d84bc13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:30:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7c29dffc61b699aaf301b67a0028d0cce35173ff24cd2ebc7ab4f7385aad2d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cdfa16340bf439fad161926890607928dfbe04d5a82aa3d88d0475dfe23ed52b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e70fb8d0c32b04610bb48bff7818eb9f57452a06a98b386dcffd2cf902600a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:30:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40
6eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://406eb3c25781d1e470ca0fc2bd1768631aa4593a28d33c660a8562c1ec580e0d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:30:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:30:22Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:30:21Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.552599 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.572422 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://195c2f0a4d6002e2014a722b6a36b63e0449682b94d645bc62c73855bbce7595\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.593461 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:31:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7338abbc140a2880b216f09f617a08f67e9b72cc8c3267cd909b20097edfcc81\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d06a024d128acfd
973d0bfcf03b70b831ab199bf76320fb83f647055818ef6cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:32:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.610431 4755 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-8btvn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e5ba4f17-8c41-4124-b563-01d5f1751139\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:32:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:33:08Z\\\",\\\"message\\\":\\\"2026-03-20T13:32:22+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8\\\\n2026-03-20T13:32:22+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f8da8f42-c6fc-444d-99e3-fe8d63af21e8 to /host/opt/cni/bin/\\\\n2026-03-20T13:32:23Z [verbose] multus-daemon started\\\\n2026-03-20T13:32:23Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:33:08Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:32:21Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:33:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p9w8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:32:06Z\\\"}}\" for pod \"openshift-multus\"/\"multus-8btvn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766401 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766487 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766507 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766534 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.766554 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.784562 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789882 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789959 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.789978 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.790007 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.790027 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.808910 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813530 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813587 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813607 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813635 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.813690 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.829956 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835016 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835119 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835144 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835174 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.835205 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.852598 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.858043 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.858099 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.858118 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.858145 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:41 crc kubenswrapper[4755]: I0320 13:33:41.858168 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:41Z","lastTransitionTime":"2026-03-20T13:33:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.881476 4755 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:33:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"382501ad-cb22-4ccb-a572-771d7a82be1e\\\",\\\"systemUUID\\\":\\\"ec91ed1b-a6ed-4cb2-884d-632a869fcc2d\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:33:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:33:41 crc kubenswrapper[4755]: E0320 13:33:41.881624 4755 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:33:42 crc kubenswrapper[4755]: I0320 13:33:42.225312 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:42 crc kubenswrapper[4755]: E0320 13:33:42.225578 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.225297 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.225561 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.225944 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.226075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:43 crc kubenswrapper[4755]: I0320 13:33:43.226116 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:43 crc kubenswrapper[4755]: E0320 13:33:43.226278 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:44 crc kubenswrapper[4755]: I0320 13:33:44.225702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:44 crc kubenswrapper[4755]: E0320 13:33:44.226113 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.224966 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.225269 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.225250 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.225514 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.225599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.226088 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:45 crc kubenswrapper[4755]: I0320 13:33:45.226488 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:45 crc kubenswrapper[4755]: E0320 13:33:45.226744 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bd25w_openshift-ovn-kubernetes(e0de398a-6f32-4b1c-a840-10ff45da7251)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" Mar 20 13:33:46 crc kubenswrapper[4755]: I0320 13:33:46.224995 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:46 crc kubenswrapper[4755]: E0320 13:33:46.225274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:46 crc kubenswrapper[4755]: E0320 13:33:46.358964 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225437 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225512 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:47 crc kubenswrapper[4755]: I0320 13:33:47.225512 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.225648 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.225826 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:47 crc kubenswrapper[4755]: E0320 13:33:47.226066 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:48 crc kubenswrapper[4755]: I0320 13:33:48.225606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:48 crc kubenswrapper[4755]: E0320 13:33:48.225969 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225593 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225770 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.225824 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:49 crc kubenswrapper[4755]: I0320 13:33:49.225870 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.226195 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:49 crc kubenswrapper[4755]: E0320 13:33:49.226281 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:50 crc kubenswrapper[4755]: I0320 13:33:50.224706 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:50 crc kubenswrapper[4755]: E0320 13:33:50.224929 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225424 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225453 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.225489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.226598 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.226701 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.227338 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:51 crc kubenswrapper[4755]: E0320 13:33:51.361758 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.364554 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-dmzsb" podStartSLOduration=139.364539906 podStartE2EDuration="2m19.364539906s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.331356355 +0000 UTC m=+210.929288884" watchObservedRunningTime="2026-03-20 13:33:51.364539906 +0000 UTC m=+210.962472435" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.403693 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=117.403635627 podStartE2EDuration="1m57.403635627s" podCreationTimestamp="2026-03-20 13:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.403529564 +0000 UTC m=+211.001462153" watchObservedRunningTime="2026-03-20 13:33:51.403635627 +0000 UTC m=+211.001568166" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.415162 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=49.415126107 podStartE2EDuration="49.415126107s" podCreationTimestamp="2026-03-20 13:33:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.414800727 +0000 UTC m=+211.012733286" watchObservedRunningTime="2026-03-20 13:33:51.415126107 +0000 UTC m=+211.013058676" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.431852 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zf67p" podStartSLOduration=139.431826356 podStartE2EDuration="2m19.431826356s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.431028641 +0000 UTC m=+211.028961210" watchObservedRunningTime="2026-03-20 13:33:51.431826356 +0000 UTC m=+211.029758895" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.460199 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=52.460170899 podStartE2EDuration="52.460170899s" podCreationTimestamp="2026-03-20 13:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.459924032 +0000 UTC m=+211.057856581" watchObservedRunningTime="2026-03-20 13:33:51.460170899 +0000 UTC m=+211.058103438" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.475539 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podStartSLOduration=139.475509226 podStartE2EDuration="2m19.475509226s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.474504346 +0000 UTC m=+211.072436895" watchObservedRunningTime="2026-03-20 13:33:51.475509226 +0000 UTC m=+211.073441775" Mar 20 13:33:51 crc 
kubenswrapper[4755]: I0320 13:33:51.578697 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=104.578633898 podStartE2EDuration="1m44.578633898s" podCreationTimestamp="2026-03-20 13:32:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.555946367 +0000 UTC m=+211.153878926" watchObservedRunningTime="2026-03-20 13:33:51.578633898 +0000 UTC m=+211.176566437" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.624356 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8btvn" podStartSLOduration=139.62433915 podStartE2EDuration="2m19.62433915s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.624013211 +0000 UTC m=+211.221945740" watchObservedRunningTime="2026-03-20 13:33:51.62433915 +0000 UTC m=+211.222271679" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.653741 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-b9bt2" podStartSLOduration=139.653711696 podStartE2EDuration="2m19.653711696s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.639638077 +0000 UTC m=+211.237570606" watchObservedRunningTime="2026-03-20 13:33:51.653711696 +0000 UTC m=+211.251644225" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.653901 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-4mcbn" podStartSLOduration=139.653890401 podStartE2EDuration="2m19.653890401s" podCreationTimestamp="2026-03-20 13:31:32 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.652693995 +0000 UTC m=+211.250626524" watchObservedRunningTime="2026-03-20 13:33:51.653890401 +0000 UTC m=+211.251822930" Mar 20 13:33:51 crc kubenswrapper[4755]: I0320 13:33:51.667602 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=89.667577128 podStartE2EDuration="1m29.667577128s" podCreationTimestamp="2026-03-20 13:32:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:51.667386113 +0000 UTC m=+211.265318642" watchObservedRunningTime="2026-03-20 13:33:51.667577128 +0000 UTC m=+211.265509657" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074637 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074738 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074757 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074782 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.074802 4755 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:33:52Z","lastTransitionTime":"2026-03-20T13:33:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.151409 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7"] Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.152269 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.155073 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.155461 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.164425 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.164791 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205461 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205547 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod 
\"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205604 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.205808 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.225246 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:52 crc kubenswrapper[4755]: E0320 13:33:52.225607 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.287768 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.306475 4755 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.306998 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307237 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.307321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0efe9b26-e5b1-4167-bc7e-41c7d836013d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc 
kubenswrapper[4755]: I0320 13:33:52.311591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0efe9b26-e5b1-4167-bc7e-41c7d836013d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.314688 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0efe9b26-e5b1-4167-bc7e-41c7d836013d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.327640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0efe9b26-e5b1-4167-bc7e-41c7d836013d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-r4qh7\" (UID: \"0efe9b26-e5b1-4167-bc7e-41c7d836013d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.478174 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.674917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" event={"ID":"0efe9b26-e5b1-4167-bc7e-41c7d836013d","Type":"ContainerStarted","Data":"e9b47cfe036a54290ccb8766855d3184e24f28e174d798b12ce50d5671fb9c0b"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.674974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" event={"ID":"0efe9b26-e5b1-4167-bc7e-41c7d836013d","Type":"ContainerStarted","Data":"70252601887caca1e705617eecd9b0caf1e463f9164adc83eb8f5158ae933436"} Mar 20 13:33:52 crc kubenswrapper[4755]: I0320 13:33:52.694585 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-r4qh7" podStartSLOduration=140.694556156 podStartE2EDuration="2m20.694556156s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:52.691236385 +0000 UTC m=+212.289168914" watchObservedRunningTime="2026-03-20 13:33:52.694556156 +0000 UTC m=+212.292488715" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.224980 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.225042 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:53 crc kubenswrapper[4755]: I0320 13:33:53.225093 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225165 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225231 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:53 crc kubenswrapper[4755]: E0320 13:33:53.225343 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.225336 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:54 crc kubenswrapper[4755]: E0320 13:33:54.225987 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.686915 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687626 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/0.log" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687734 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" exitCode=1 Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687785 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc"} Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.687836 4755 scope.go:117] "RemoveContainer" containerID="0b13d189f96ee70b5352b0f20ca1d7a35d0a65caaa2a4628586b0efdecb55545" Mar 20 13:33:54 crc kubenswrapper[4755]: I0320 13:33:54.688833 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:33:54 crc kubenswrapper[4755]: E0320 13:33:54.689294 4755 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225331 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225468 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225565 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.225331 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225754 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.225914 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.246838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.247038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.247152 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247298 4755 configmap.go:193] Couldn't get configMap 
openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247393 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247366923 +0000 UTC m=+336.845299492 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247493 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247477666 +0000 UTC m=+336.845410235 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247640 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.247734 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.247720284 +0000 UTC m=+336.845652853 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.348027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.348126 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348332 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348400 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348471 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348411 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348537 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348563 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348615 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.348570696 +0000 UTC m=+336.946503385 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: E0320 13:33:55.348695 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:35:57.348636038 +0000 UTC m=+336.946568837 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:33:55 crc kubenswrapper[4755]: I0320 13:33:55.695952 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:33:56 crc kubenswrapper[4755]: I0320 13:33:56.224985 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:56 crc kubenswrapper[4755]: E0320 13:33:56.225244 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:56 crc kubenswrapper[4755]: E0320 13:33:56.363114 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231060 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.231272 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.231733 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:57 crc kubenswrapper[4755]: I0320 13:33:57.231978 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:57 crc kubenswrapper[4755]: E0320 13:33:57.232084 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.224776 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:33:58 crc kubenswrapper[4755]: E0320 13:33:58.226033 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.226241 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.711939 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.716197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerStarted","Data":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.717887 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:33:58 crc kubenswrapper[4755]: I0320 13:33:58.777390 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podStartSLOduration=146.777367981 podStartE2EDuration="2m26.777367981s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:33:58.775467092 +0000 UTC m=+218.373399631" watchObservedRunningTime="2026-03-20 13:33:58.777367981 +0000 UTC m=+218.375300520" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.195626 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"] Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.195797 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.195923 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.225676 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:33:59 crc kubenswrapper[4755]: I0320 13:33:59.225792 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.225852 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:33:59 crc kubenswrapper[4755]: E0320 13:33:59.225990 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:00 crc kubenswrapper[4755]: I0320 13:34:00.225247 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:00 crc kubenswrapper[4755]: E0320 13:34:00.226056 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.224879 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.224973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:01 crc kubenswrapper[4755]: I0320 13:34:01.225073 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227274 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227420 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.227551 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:01 crc kubenswrapper[4755]: E0320 13:34:01.363751 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:02 crc kubenswrapper[4755]: I0320 13:34:02.224979 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:02 crc kubenswrapper[4755]: E0320 13:34:02.225426 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225456 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225575 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.225707 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.225849 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:03 crc kubenswrapper[4755]: I0320 13:34:03.225962 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:03 crc kubenswrapper[4755]: E0320 13:34:03.226224 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:04 crc kubenswrapper[4755]: I0320 13:34:04.224715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:04 crc kubenswrapper[4755]: E0320 13:34:04.224925 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225590 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225689 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.225887 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:05 crc kubenswrapper[4755]: I0320 13:34:05.225922 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.226160 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:05 crc kubenswrapper[4755]: E0320 13:34:05.226302 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.225188 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:06 crc kubenswrapper[4755]: E0320 13:34:06.225410 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.226209 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:34:06 crc kubenswrapper[4755]: E0320 13:34:06.405428 4755 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.754961 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:34:06 crc kubenswrapper[4755]: I0320 13:34:06.755072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552"} Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.150284 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224938 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:07 crc kubenswrapper[4755]: I0320 13:34:07.224964 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225156 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225336 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:07 crc kubenswrapper[4755]: E0320 13:34:07.225581 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:08 crc kubenswrapper[4755]: I0320 13:34:08.224862 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:08 crc kubenswrapper[4755]: E0320 13:34:08.225130 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.224783 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.224783 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.224981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:09 crc kubenswrapper[4755]: I0320 13:34:09.225086 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.225195 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:09 crc kubenswrapper[4755]: E0320 13:34:09.225315 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:10 crc kubenswrapper[4755]: I0320 13:34:10.224747 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:10 crc kubenswrapper[4755]: E0320 13:34:10.225004 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225259 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225421 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:11 crc kubenswrapper[4755]: I0320 13:34:11.225583 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.227451 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-kpm42" podUID="37d1e037-c169-4932-9928-f3d23ff47c07" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.228001 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:34:11 crc kubenswrapper[4755]: E0320 13:34:11.227875 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.225371 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.228968 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.229582 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.464248 4755 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.530997 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.531714 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532318 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532388 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.532875 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.533411 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.534161 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.534432 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.548006 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.549261 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.550842 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551257 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551417 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551543 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551761 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.551902 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:34:12 
crc kubenswrapper[4755]: I0320 13:34:12.552054 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.552230 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.557310 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.557845 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.558041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.563858 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.568894 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.572328 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.574351 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.575709 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.576305 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.585368 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.602402 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.603301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611302 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611402 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") 
pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612042 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612206 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611889 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612326 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.611937 4755 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612430 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612466 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612018 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612682 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612817 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612883 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612927 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612950 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612973 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613048 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613149 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613173 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613244 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613267 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.613356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612056 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612147 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612185 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.612223 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.633149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634043 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634387 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634598 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634776 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.634605 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.635056 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637131 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637371 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637529 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637722 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.637880 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.638861 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639061 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639086 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639218 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639323 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639418 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639426 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.639683 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.640803 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.641322 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-l4v7x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.642899 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643409 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643629 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.643863 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644015 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644204 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.644835 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.645781 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.650559 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.650877 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.655146 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.676705 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.677308 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.677836 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.697116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.697692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698074 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698374 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698501 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.698521 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.699804 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.700412 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.706480 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707025 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707766 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.707988 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708220 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708310 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708399 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708461 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708572 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708581 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708777 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.708913 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709039 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709185 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.709594 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711494 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711573 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711785 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711799 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711884 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711889 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711937 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.711977 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.712003 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714413 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714443 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714471 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714539 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714570 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.714668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.715852 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " pod="openshift-console/downloads-7954f5f757-l4v7x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.715918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.717362 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-auth-proxy-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.717599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39556510-7df7-4b2f-94d6-75b649060c22-config\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.718234 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.719192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.724217 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.725590 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-node-pullsecrets\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731790 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731843 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731970 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.731995 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732066 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732116 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732249 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732279 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732311 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732474 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732533 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732579 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732629 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732766 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732789 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732820 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732920 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.732967 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733000 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733051 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bq5qt\" (UniqueName: \"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733158 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733182 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.733256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.734562 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.735882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-audit\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.737980 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.738469 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.738815 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-encryption-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.739855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-service-ca-bundle\") pod 
\"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.740326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.740676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-image-import-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741166 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-config\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741335 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-policies\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741642 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.741989 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b41fdebf-1886-4b30-b583-368242316562-audit-dir\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.742704 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.742961 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743128 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/45c99095-eab0-49c4-8ded-fc5359b43ef2-audit-dir\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743487 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743811 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-config\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.743854 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744125 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-etcd-serving-ca\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744366 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.744870 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.745041 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.774101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-config\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " 
pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.774687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-etcd-client\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.778267 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/39556510-7df7-4b2f-94d6-75b649060c22-machine-approver-tls\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.778588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-serving-cert\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.779441 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.779900 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-serving-cert\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: 
\"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.780294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-etcd-client\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.784894 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.786426 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41fdebf-1886-4b30-b583-368242316562-serving-cert\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.787238 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/45c99095-eab0-49c4-8ded-fc5359b43ef2-encryption-config\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.787377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.788428 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.788703 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.790448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.791321 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.792487 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.793444 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.794915 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.800357 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.802574 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.806122 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.808625 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.811741 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.812754 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.813588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.813622 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.818148 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b41fdebf-1886-4b30-b583-368242316562-trusted-ca-bundle\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822369 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822447 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.822631 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.823591 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.824113 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-r48mq"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.824948 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825282 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825508 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825710 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.825719 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827002 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827556 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.827582 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828191 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l499t\" (UniqueName: \"kubernetes.io/projected/a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d-kube-api-access-l499t\") pod \"authentication-operator-69f744f599-xl5tr\" (UID: \"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh79l\" (UniqueName: \"kubernetes.io/projected/39556510-7df7-4b2f-94d6-75b649060c22-kube-api-access-fh79l\") pod \"machine-approver-56656f9798-h4gd5\" (UID: \"39556510-7df7-4b2f-94d6-75b649060c22\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.828763 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.829265 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.830459 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd4b4\" (UniqueName: \"kubernetes.io/projected/b41fdebf-1886-4b30-b583-368242316562-kube-api-access-sd4b4\") pod \"apiserver-76f77b778f-6ql2s\" (UID: \"b41fdebf-1886-4b30-b583-368242316562\") " pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.830505 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.831599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.832508 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833169 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833446 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"] Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833582 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl9hf\" (UniqueName: \"kubernetes.io/projected/7b5aa2ba-0ffb-4094-9200-0f209d9e7fec-kube-api-access-kl9hf\") pod \"openshift-apiserver-operator-796bbdcf4f-4kg9k\" (UID: \"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.833981 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834171 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834252 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834341 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834422 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834442 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834517 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834543 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bq5qt\" (UniqueName: \"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834684 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 
crc kubenswrapper[4755]: I0320 13:34:12.834707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834790 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834809 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834845 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod 
\"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.834865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835153 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835202 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835242 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835269 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: 
\"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835289 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835428 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835448 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.835575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc 
kubenswrapper[4755]: I0320 13:34:12.836314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/efeb6afa-e175-4bad-a0bb-5ace61619959-available-featuregates\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.836955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.839989 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"route-controller-manager-6576b87f9c-g4ftg\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.841368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-config\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.842142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.842768 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846394 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846590 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.846959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1fdd6691-9136-43ba-abea-7ba6862e9681-images\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847176 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 
20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847808 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.847906 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848101 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848234 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848285 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f67c724-386b-4736-ace1-73430edd3558-metrics-tls\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.848904 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nm7\" (UniqueName: \"kubernetes.io/projected/45c99095-eab0-49c4-8ded-fc5359b43ef2-kube-api-access-n7nm7\") pod \"apiserver-7bbb656c7d-92hxn\" (UID: \"45c99095-eab0-49c4-8ded-fc5359b43ef2\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.849567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850472 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.850559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/7e3cedcb-923f-4cf5-b344-dd3842309d39-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: \"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853098 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/efeb6afa-e175-4bad-a0bb-5ace61619959-serving-cert\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.853986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.854605 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.855368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1fdd6691-9136-43ba-abea-7ba6862e9681-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.855562 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.857331 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.858232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"controller-manager-879f6c89f-pz64x\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.860900 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.863162 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.866148 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.866598 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.867538 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.868200 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.868585 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.870885 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.872076 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.873975 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.875109 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.875548 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.876014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.876298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.878409 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.878532 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.880008 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.880804 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.881166 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.881767 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.882229 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.883361 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.883829 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-2b4nn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.884325 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2b4nn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.884858 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.885333 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.886729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.887956 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.889223 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.889810 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.892853 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.894064 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.894460 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.895107 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.897127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.898121 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.899417 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.900735 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.902282 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.902952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.903937 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.904971 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.906267 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.906922 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.908075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.909548 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.909983 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.910169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.911367 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.912627 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.913902 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.915159 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.915977 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.916428 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.917231 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r96g9"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.918589 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.919572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.920592 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.921467 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.922445 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.923445 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925028 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925380 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.925936 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.927026 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.928314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.930882 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.934313 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.934516 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.935257 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.935684 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.936864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.938127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.939536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.941333 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2rw7x"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.945037 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2rw7x"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.946589 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2rw7x"]
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.951855 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.951988 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952460 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952698 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.952871 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.953787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.953897 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.954778 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-trusted-ca\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955287 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ff5ba16-93f9-4313-a857-23a1c87c1cac-config\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.955690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956316 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956471 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956582 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956608 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956680 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956780 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.956852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.957683 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.958217 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.959789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1b9cfbce-3f17-4155-a022-243e6d220bf8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.959915 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/862bcbad-0c15-4e2d-b205-83ab3721cd9a-config\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.961516 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/862bcbad-0c15-4e2d-b205-83ab3721cd9a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.962556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.963466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/04d33b23-44ac-48b5-8981-fe9a764b1bee-metrics-tls\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.968866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ff5ba16-93f9-4313-a857-23a1c87c1cac-serving-cert\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.970759 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.973102 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.976475 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/04d33b23-44ac-48b5-8981-fe9a764b1bee-trusted-ca\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.978439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.980880 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/1b9cfbce-3f17-4155-a022-243e6d220bf8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.981215 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.988978 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.995688 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:12 crc kubenswrapper[4755]: I0320 13:34:12.996550 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.007117 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.017498 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.036137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.055372 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058412 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058450 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058589 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 
13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058609 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058636 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.058725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.100921 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.116626 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.135177 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.155952 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.176240 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.198869 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.198932 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.222530 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.224989 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.225031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.225210 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.235873 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.241716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.260917 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.275725 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.297636 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.311561 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.316534 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.335744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.351123 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-xl5tr"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.355097 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:34:13 crc 
kubenswrapper[4755]: W0320 13:34:13.372381 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0bdc24c_c8d3_458e_8cf8_d91164ef2b9d.slice/crio-39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970 WatchSource:0}: Error finding container 39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970: Status 404 returned error can't find the container with id 39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.376087 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.395566 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.416364 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.435324 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.456760 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.476005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.496911 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.516093 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.535130 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.555437 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.576069 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.585634 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.586687 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-6ql2s"] Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.595770 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.615195 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.635359 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: W0320 13:34:13.655598 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41fdebf_1886_4b30_b583_368242316562.slice/crio-5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab WatchSource:0}: Error finding container 
5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab: Status 404 returned error can't find the container with id 5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab Mar 20 13:34:13 crc kubenswrapper[4755]: W0320 13:34:13.656001 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53c8ae1f_e5a9_4ac8_8af7_2169378af3d2.slice/crio-3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198 WatchSource:0}: Error finding container 3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198: Status 404 returned error can't find the container with id 3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.661881 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.677001 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.697476 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.716631 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.736435 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.756545 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.778327 4755 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.787195 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b27e760-b22d-415a-93cc-866c2471ee63-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.795089 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.801182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b27e760-b22d-415a-93cc-866c2471ee63-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.808564 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" event={"ID":"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec","Type":"ContainerStarted","Data":"ba751e571165d321957db0bbe94ca3f2e5f473370ae72a84257ecb497533f74a"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.808627 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" event={"ID":"7b5aa2ba-0ffb-4094-9200-0f209d9e7fec","Type":"ContainerStarted","Data":"5672dda1d5c9b160a4b7b486a54671b0648f1faefc7e70e38fa9fc14f2d3dc47"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.810983 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerStarted","Data":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.811070 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerStarted","Data":"ea03e21c825372e4f508e4183f07bab9440aa36d8af7963578ed0bad5bcf3f8f"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.811814 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.813544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerStarted","Data":"3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815116 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815511 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pz64x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.815572 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"45890e5f4598060274cae5d6900124de864ae12693826b146db8417bf5528c14"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816205 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"75d986ee8bd4078a821be23754658900dbf0ee509d3df8503a35488b3943426a"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.816228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" event={"ID":"39556510-7df7-4b2f-94d6-75b649060c22","Type":"ContainerStarted","Data":"c1799172fcbf1d7e30ac93fa777d3291701003b3efce3a764bd729a9bf9db293"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818399 4755 generic.go:334] "Generic (PLEG): container finished" podID="45c99095-eab0-49c4-8ded-fc5359b43ef2" containerID="878f489e5b623b310bd55ab79b117a4c08d12b9f37999d42abdfb061f8c34b5d" exitCode=0 Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerDied","Data":"878f489e5b623b310bd55ab79b117a4c08d12b9f37999d42abdfb061f8c34b5d"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.818702 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" 
event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerStarted","Data":"098c7922b1ce945e2390d6d1d82aeb2786b9b7ba4510d95d38765df68187a963"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.821078 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"5b9082ef3d4aa939e6513b01b28f98aba4649191aee5ec16458e3f590fc2aeab"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.823646 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" event={"ID":"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d","Type":"ContainerStarted","Data":"ff675657f16815ecc86e4757b1af8ea00efd122872cb67c077719f8d26daf593"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.823709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" event={"ID":"a0bdc24c-c8d3-458e-8cf8-d91164ef2b9d","Type":"ContainerStarted","Data":"39ca80287e557f4d0dfda4aa67453cdcbe180e08244d5a6543e65e1802f1a970"} Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.836067 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.853864 4755 request.go:700] Waited for 1.018642805s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.877368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msjhl\" (UniqueName: \"kubernetes.io/projected/7e3cedcb-923f-4cf5-b344-dd3842309d39-kube-api-access-msjhl\") pod \"cluster-samples-operator-665b6dd947-4kg2b\" (UID: 
\"7e3cedcb-923f-4cf5-b344-dd3842309d39\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.894463 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"oauth-openshift-558db77b4-wpj5p\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") " pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.916543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltjzz\" (UniqueName: \"kubernetes.io/projected/9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8-kube-api-access-ltjzz\") pod \"downloads-7954f5f757-l4v7x\" (UID: \"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8\") " pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.931439 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4qp2\" (UniqueName: \"kubernetes.io/projected/efeb6afa-e175-4bad-a0bb-5ace61619959-kube-api-access-p4qp2\") pod \"openshift-config-operator-7777fb866f-lp7qq\" (UID: \"efeb6afa-e175-4bad-a0bb-5ace61619959\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.954406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhcpn\" (UniqueName: \"kubernetes.io/projected/1fdd6691-9136-43ba-abea-7ba6862e9681-kube-api-access-mhcpn\") pod \"machine-api-operator-5694c8668f-4zdx6\" (UID: \"1fdd6691-9136-43ba-abea-7ba6862e9681\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.969672 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bq5qt\" (UniqueName: 
\"kubernetes.io/projected/5f67c724-386b-4736-ace1-73430edd3558-kube-api-access-bq5qt\") pod \"dns-operator-744455d44c-64g8c\" (UID: \"5f67c724-386b-4736-ace1-73430edd3558\") " pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.976140 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.987970 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.994393 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" Mar 20 13:34:13 crc kubenswrapper[4755]: I0320 13:34:13.996973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.006899 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.016378 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.018574 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.024705 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.036505 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.044464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.046116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-profile-collector-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.056973 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.058983 4755 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059089 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert podName:673ae012-3e48-4408-8a01-a67833cabd26 nodeName:}" failed. No retries permitted until 2026-03-20 13:34:14.559063942 +0000 UTC m=+234.156996471 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert") pod "catalog-operator-68c6474976-h4jf2" (UID: "673ae012-3e48-4408-8a01-a67833cabd26") : failed to sync secret cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059272 4755 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: E0320 13:34:14.059941 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume podName:4b90540c-9ef1-478a-a7a1-48817d0c63d0 nodeName:}" failed. No retries permitted until 2026-03-20 13:34:14.559930348 +0000 UTC m=+234.157862877 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume") pod "collect-profiles-29566890-k7h8h" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.077007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.095769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.118743 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.139638 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.156553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.175391 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.198148 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.216554 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.221197 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.236529 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.257133 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.264221 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.276794 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.289738 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4zdx6"] Mar 20 13:34:14 crc 
kubenswrapper[4755]: I0320 13:34:14.297156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.299117 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"] Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.314714 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefeb6afa_e175_4bad_a0bb_5ace61619959.slice/crio-68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640 WatchSource:0}: Error finding container 68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640: Status 404 returned error can't find the container with id 68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640 Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.317251 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fdd6691_9136_43ba_abea_7ba6862e9681.slice/crio-ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89 WatchSource:0}: Error finding container ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89: Status 404 returned error can't find the container with id ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.317291 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.335568 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.356604 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 
13:34:14.365943 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-l4v7x"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.376984 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.393897 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8d45e6_cf9d_4f6f_b459_efe220bbf6d8.slice/crio-d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d WatchSource:0}: Error finding container d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d: Status 404 returned error can't find the container with id d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.395713 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.416740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.437090 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.447215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-64g8c"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.458129 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: W0320 13:34:14.460181 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f67c724_386b_4736_ace1_73430edd3558.slice/crio-6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753 WatchSource:0}: Error finding container 6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753: Status 404 returned error can't find the container with id 6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.476426 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.497182 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.516330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.537637 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.556181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.569988 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b"] Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.575456 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.585216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.585320 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.586243 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.593033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/673ae012-3e48-4408-8a01-a67833cabd26-srv-cert\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.597133 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.616106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.637581 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.655091 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54506: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.666729 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.696992 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.715059 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.745013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.750798 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54522: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.754737 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.776889 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.793073 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54528: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.795398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.820940 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.831286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"88aa621b295c782e88050d005ee25a67f87afd469bce1ac8a057660b2326fafa"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.831354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"6fb90949be58afd1a2fbe86b2b26c3a1e57cad1706a6b642d6f25cded95b4753"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.832914 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"96510e9c3614e05b5cac667c321981d56cc16b49515f280c0404f0f02d7e0f61"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.832939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"d67a003c3dab3abc4dedddd843659ac5a0416d89137a7d6ebf9505449b28b2c2"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.834677 4755 generic.go:334] "Generic (PLEG): container finished" podID="b41fdebf-1886-4b30-b583-368242316562" containerID="a87ca0fc084eebae1508d13df991c9fdf881a46c04c1ca309ae21ad137c9ac71" exitCode=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.834724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" 
event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerDied","Data":"a87ca0fc084eebae1508d13df991c9fdf881a46c04c1ca309ae21ad137c9ac71"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"0c703fd3704b0dc3498707636368a0438967cba95c29e00b3d89f6b05a93ae25"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"9f31d7a336050576f16fb2c470c156191647fe965629887bacb3403c6fce0422"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.838211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" event={"ID":"1fdd6691-9136-43ba-abea-7ba6862e9681","Type":"ContainerStarted","Data":"ce75b918dd5f9b6881df65b686cfd19cc28fe7b1b11ba9697c1fcdd706f94c89"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.840388 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.842337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerStarted","Data":"275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.842368 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerStarted","Data":"9edc35520733cdbb8ffbbdcc2f02ec6ef4e5e7ada3cc88f2fa7d388e53bb80dd"} Mar 20 
13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.843046 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.844995 4755 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-wpj5p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" start-of-body= Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.845063 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.11:6443/healthz\": dial tcp 10.217.0.11:6443: connect: connection refused" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.846644 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l4v7x" event={"ID":"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8","Type":"ContainerStarted","Data":"915eb530da816e9e17072047fc4a554dcb3ad3d7ada42dbbd3ff4edffb783862"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.846722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-l4v7x" event={"ID":"9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8","Type":"ContainerStarted","Data":"d9129aa43458a10d1d6430b278b58d5e132a9dc2fa644c6af419a797865abc5d"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.847783 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-l4v7x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.848944 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.848986 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.850955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" event={"ID":"45c99095-eab0-49c4-8ded-fc5359b43ef2","Type":"ContainerStarted","Data":"5f61f4cc6b6288ecfe2b511eff06c1b8d1e7228a39df7183bd2343f89dcfafec"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.857098 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54544: no serving certificate available for the kubelet" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.858004 4755 request.go:700] Waited for 1.912358624s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861185 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerStarted","Data":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861353 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.861835 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864435 4755 generic.go:334] "Generic (PLEG): container finished" podID="efeb6afa-e175-4bad-a0bb-5ace61619959" containerID="5ff09c4938c795e7be70b5bea1bdf0083d2e9b549cb47503854dc79b5ee1c65a" exitCode=0 Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864959 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerDied","Data":"5ff09c4938c795e7be70b5bea1bdf0083d2e9b549cb47503854dc79b5ee1c65a"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.864987 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerStarted","Data":"68169bebc74f84b8f4ca488e0f4d2f775d8627d20ca7c3eb126232148881a640"} Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.875042 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.880345 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.931187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2l2c\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-kube-api-access-v2l2c\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.935841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/862bcbad-0c15-4e2d-b205-83ab3721cd9a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-mw47r\" (UID: \"862bcbad-0c15-4e2d-b205-83ab3721cd9a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.938060 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.963254 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.975290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"console-f9d7485db-rb5zn\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") " pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.995280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/04d33b23-44ac-48b5-8981-fe9a764b1bee-bound-sa-token\") pod \"ingress-operator-5b745b69d9-lth4v\" (UID: \"04d33b23-44ac-48b5-8981-fe9a764b1bee\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:14 crc kubenswrapper[4755]: I0320 13:34:14.996086 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54552: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.002127 4755 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.025228 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p4pm\" (UniqueName: \"kubernetes.io/projected/1b9cfbce-3f17-4155-a022-243e6d220bf8-kube-api-access-6p4pm\") pod \"cluster-image-registry-operator-dc59b4c8b-88jhp\" (UID: \"1b9cfbce-3f17-4155-a022-243e6d220bf8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.046323 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpd9k\" (UniqueName: \"kubernetes.io/projected/8ff5ba16-93f9-4313-a857-23a1c87c1cac-kube-api-access-lpd9k\") pod \"console-operator-58897d9998-6zgr4\" (UID: \"8ff5ba16-93f9-4313-a857-23a1c87c1cac\") " pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.062928 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwwp\" (UniqueName: \"kubernetes.io/projected/c452cf60-67a1-434f-b2da-7e81992e28a6-kube-api-access-qnwwp\") pod \"migrator-59844c95c7-ggscl\" (UID: \"c452cf60-67a1-434f-b2da-7e81992e28a6\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.083063 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54566: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.084056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"collect-profiles-29566890-k7h8h\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.107571 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.133696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86bzg\" (UniqueName: \"kubernetes.io/projected/673ae012-3e48-4408-8a01-a67833cabd26-kube-api-access-86bzg\") pod \"catalog-operator-68c6474976-h4jf2\" (UID: \"673ae012-3e48-4408-8a01-a67833cabd26\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.137675 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.141624 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b27e760-b22d-415a-93cc-866c2471ee63-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vdgvl\" (UID: \"8b27e760-b22d-415a-93cc-866c2471ee63\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.154600 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.181282 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54580: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.186424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.186488 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.200037 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.219337 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.256715 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.269812 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.278119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.296230 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306643 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306676 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306712 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306740 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306755 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306797 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306848 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306865 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306886 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.306982 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307001 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: 
\"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307020 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307037 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307054 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307080 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 
13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.307598 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r"] Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309533 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.309734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.309811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310147 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzt4h\" (UniqueName: 
\"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.310357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.311931 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.312711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313178 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313595 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313668 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod 
\"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313774 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.313793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314064 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314086 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.314109 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.315155 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.815142081 +0000 UTC m=+235.413074610 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.315864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9x8r\" (UniqueName: 
\"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316351 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.316821 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.361539 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54582: no serving certificate available for the kubelet" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.385056 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.394992 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-l4v7x" podStartSLOduration=163.394967753 podStartE2EDuration="2m43.394967753s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:15.33677151 +0000 UTC m=+234.934704049" watchObservedRunningTime="2026-03-20 13:34:15.394967753 +0000 UTC m=+234.992900282" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.418857 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419472 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419508 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzt4h\" (UniqueName: \"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419592 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 
13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419765 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.419935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.449001 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.948964298 +0000 UTC m=+235.546896827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456252 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456378 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456424 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc 
kubenswrapper[4755]: I0320 13:34:15.456463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456496 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456675 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 
crc kubenswrapper[4755]: I0320 13:34:15.456755 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456845 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456866 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9x8r\" (UniqueName: \"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: 
\"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456899 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456921 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456962 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.456984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457022 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457058 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457082 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457178 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457197 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457237 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457323 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457353 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457408 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457447 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 
13:34:15.457478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457497 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457567 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: 
\"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457630 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457664 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457713 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457743 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457817 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457846 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457941 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457962 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.457993 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.458021 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.459563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/14060dd8-a97a-404b-9020-9f9e519e78d9-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.474603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-key\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: 
\"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.476669 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8def433-c490-4469-9e43-12ba06428091-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.477075 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8def433-c490-4469-9e43-12ba06428091-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.477382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-service-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.478366 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-signing-cabundle\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.480965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-auth-proxy-config\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.482020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-tmpfs\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.484162 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-serving-cert\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.492435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-stats-auth\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.492553 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-metrics-certs\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/8b65e162-155e-4d40-ab1a-e3560b29f19f-default-certificate\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-config\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.493843 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-ca\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.494741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-profile-collector-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/85fb2982-9af0-4450-80f4-12fbd6e7a590-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.495513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8b65e162-155e-4d40-ab1a-e3560b29f19f-service-ca-bundle\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.496427 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:15.996409414 +0000 UTC m=+235.594341943 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.497074 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-webhook-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.498543 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/903404df-f7c6-46d5-9227-748ecc920ac3-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499451 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8774c8e-1dd9-481c-9091-85a2fe704069-proxy-tls\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: 
\"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.499868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e8774c8e-1dd9-481c-9091-85a2fe704069-images\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.500191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.500227 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0e40521a-c254-4fd5-99e8-1296dd288e2d-config\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.506355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9f788c2-2578-4d14-9c8f-115f15a5a817-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.507951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-config\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.510931 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"] Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.515450 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.518855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0e40521a-c254-4fd5-99e8-1296dd288e2d-etcd-client\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.523158 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/14060dd8-a97a-404b-9020-9f9e519e78d9-proxy-tls\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.526629 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbts8\" (UniqueName: \"kubernetes.io/projected/eb9a014d-9a58-4461-adc6-2ee3981782a3-kube-api-access-tbts8\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: 
\"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.526698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.528136 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.528548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-serving-cert\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.530096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl97m\" (UniqueName: \"kubernetes.io/projected/5f249077-e650-4ad5-b008-7af17910535a-kube-api-access-dl97m\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.530951 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"certs\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-certs\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.531455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.531584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/eb9a014d-9a58-4461-adc6-2ee3981782a3-srv-cert\") pod \"olm-operator-6b444d44fb-rtzzb\" (UID: \"eb9a014d-9a58-4461-adc6-2ee3981782a3\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-apiservice-cert\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532501 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9wh\" (UniqueName: \"kubernetes.io/projected/c9f788c2-2578-4d14-9c8f-115f15a5a817-kube-api-access-5c9wh\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.532844 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5f249077-e650-4ad5-b008-7af17910535a-node-bootstrap-token\") pod \"machine-config-server-2b4nn\" (UID: \"5f249077-e650-4ad5-b008-7af17910535a\") " pod="openshift-machine-config-operator/machine-config-server-2b4nn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.533689 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzt4h\" (UniqueName: \"kubernetes.io/projected/8b65e162-155e-4d40-ab1a-e3560b29f19f-kube-api-access-mzt4h\") pod \"router-default-5444994796-r48mq\" (UID: \"8b65e162-155e-4d40-ab1a-e3560b29f19f\") " pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.534121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9f788c2-2578-4d14-9c8f-115f15a5a817-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8xtw2\" (UID: \"c9f788c2-2578-4d14-9c8f-115f15a5a817\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.539832 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxhqb\" (UniqueName: \"kubernetes.io/projected/e8774c8e-1dd9-481c-9091-85a2fe704069-kube-api-access-lxhqb\") pod \"machine-config-operator-74547568cd-52kqn\" (UID: \"e8774c8e-1dd9-481c-9091-85a2fe704069\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.540468 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"image-registry-697d97f7c8-bckdl\" (UID: 
\"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559419 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.559587 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.059557457 +0000 UTC m=+235.657489986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: 
\"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559806 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559828 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559950 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: 
\"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.559979 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560052 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.560068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") 
" pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.561001 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af19a889-4a85-42c6-aafa-6714754c5a86-config-volume\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.563100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-csi-data-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.564173 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.064125016 +0000 UTC m=+235.662057625 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564639 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-plugins-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-registration-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-mountpoint-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.564838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da3395eb-3396-4bbc-8a18-3d57519c4667-socket-dir\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" Mar 20 13:34:15 
crc kubenswrapper[4755]: I0320 13:34:15.571717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/51b44b44-8a09-430a-ba3c-92e2c2f916f6-cert\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.572563 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/af19a889-4a85-42c6-aafa-6714754c5a86-metrics-tls\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.573352 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lprp\" (UniqueName: \"kubernetes.io/projected/27e50d13-5c93-4dd7-a2c8-7ba505e2f549-kube-api-access-2lprp\") pod \"service-ca-operator-777779d784-zkdvd\" (UID: \"27e50d13-5c93-4dd7-a2c8-7ba505e2f549\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.616143 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"auto-csr-approver-29566894-tzlc5\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") " pod="openshift-infra/auto-csr-approver-29566894-tzlc5" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.622083 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-r48mq" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.628087 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzcxx\" (UniqueName: \"kubernetes.io/projected/4ea64dfa-8103-47fc-9ad3-693b033a1ec1-kube-api-access-gzcxx\") pod \"package-server-manager-789f6589d5-22vns\" (UID: \"4ea64dfa-8103-47fc-9ad3-693b033a1ec1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.631542 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.647825 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpm2r\" (UniqueName: \"kubernetes.io/projected/85fb2982-9af0-4450-80f4-12fbd6e7a590-kube-api-access-zpm2r\") pod \"control-plane-machine-set-operator-78cbb6b69f-2bm86\" (UID: \"85fb2982-9af0-4450-80f4-12fbd6e7a590\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.648001 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvxg\" (UniqueName: \"kubernetes.io/projected/14060dd8-a97a-404b-9020-9f9e519e78d9-kube-api-access-cvvxg\") pod \"machine-config-controller-84d6567774-jxcch\" (UID: \"14060dd8-a97a-404b-9020-9f9e519e78d9\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.655227 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.662187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.663012 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.162985578 +0000 UTC m=+235.760918107 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.673518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"marketplace-operator-79b997595-229g6\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " pod="openshift-marketplace/marketplace-operator-79b997595-229g6"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.678248 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.685577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-2b4nn"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.699994 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b56ded58-9184-4d39-b422-9ea9e8f6b9ea-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-4n95j\" (UID: \"b56ded58-9184-4d39-b422-9ea9e8f6b9ea\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.719369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.723142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ln85d\" (UniqueName: \"kubernetes.io/projected/c5e0183e-e0f5-4b89-a2f9-27fc07783e27-kube-api-access-ln85d\") pod \"packageserver-d55dfcdfc-m66xn\" (UID: \"c5e0183e-e0f5-4b89-a2f9-27fc07783e27\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.726043 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9x8r\" (UniqueName: \"kubernetes.io/projected/d8def433-c490-4469-9e43-12ba06428091-kube-api-access-x9x8r\") pod \"kube-storage-version-migrator-operator-b67b599dd-cc2rk\" (UID: \"d8def433-c490-4469-9e43-12ba06428091\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.768785 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.769175 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.269160143 +0000 UTC m=+235.867092672 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.769547 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.779890 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl"]
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.793970 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"auto-csr-approver-29566892-xh9lg\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") " pod="openshift-infra/auto-csr-approver-29566892-xh9lg"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.804116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257ws\" (UniqueName: \"kubernetes.io/projected/1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9-kube-api-access-257ws\") pod \"service-ca-9c57cc56f-sgkhb\" (UID: \"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9\") " pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.824584 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.825520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9wx2\" (UniqueName: \"kubernetes.io/projected/903404df-f7c6-46d5-9227-748ecc920ac3-kube-api-access-t9wx2\") pod \"multus-admission-controller-857f4d67dd-5h4zh\" (UID: \"903404df-f7c6-46d5-9227-748ecc920ac3\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.831341 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.842346 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.851090 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.858012 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.875803 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.876175 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"
Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.877758 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.377731771 +0000 UTC m=+235.975664300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.878508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lj9f\" (UniqueName: \"kubernetes.io/projected/0e40521a-c254-4fd5-99e8-1296dd288e2d-kube-api-access-4lj9f\") pod \"etcd-operator-b45778765-qkxhv\" (UID: \"0e40521a-c254-4fd5-99e8-1296dd288e2d\") " pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.885116 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbk6\" (UniqueName: \"kubernetes.io/projected/af19a889-4a85-42c6-aafa-6714754c5a86-kube-api-access-4qbk6\") pod \"dns-default-2rw7x\" (UID: \"af19a889-4a85-42c6-aafa-6714754c5a86\") " pod="openshift-dns/dns-default-2rw7x"
Mar 20 13:34:15 crc kubenswrapper[4755]: W0320 13:34:15.885756 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b65e162_155e_4d40_ab1a_e3560b29f19f.slice/crio-8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b WatchSource:0}: Error finding container 8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b: Status 404 returned error can't find the container with id 8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.894499 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.898400 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccmvv\" (UniqueName: \"kubernetes.io/projected/51b44b44-8a09-430a-ba3c-92e2c2f916f6-kube-api-access-ccmvv\") pod \"ingress-canary-r96g9\" (UID: \"51b44b44-8a09-430a-ba3c-92e2c2f916f6\") " pod="openshift-ingress-canary/ingress-canary-r96g9"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.904354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"d9510fbd56632079c0f63151bc7c72dd4b0c1b7c506093667a48f1b183b8afe1"}
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.907348 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.913900 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.923523 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.939735 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.940831 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.951428 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6px\" (UniqueName: \"kubernetes.io/projected/da3395eb-3396-4bbc-8a18-3d57519c4667-kube-api-access-sm6px\") pod \"csi-hostpathplugin-hpj2j\" (UID: \"da3395eb-3396-4bbc-8a18-3d57519c4667\") " pod="hostpath-provisioner/csi-hostpathplugin-hpj2j"
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.981035 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:15 crc kubenswrapper[4755]: E0320 13:34:15.981765 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.481725729 +0000 UTC m=+236.079658258 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:15 crc kubenswrapper[4755]: I0320 13:34:15.992047 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerStarted","Data":"a883711492469aba5080025f39ee56d456d68c7d62a0b2da2289bad36e4ed8ea"}
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.005578 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.017035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" event={"ID":"862bcbad-0c15-4e2d-b205-83ab3721cd9a","Type":"ContainerStarted","Data":"039f9c39c017712ba47cb122b137bf255d2e7b389297efbc6e37c0a64c24630c"}
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.039946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" event={"ID":"7e3cedcb-923f-4cf5-b344-dd3842309d39","Type":"ContainerStarted","Data":"1caf15d648ab356ba8ce29ae6656d1d61a4848069f486150e3c30cb8afffaf10"}
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.052481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-r96g9"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.068008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" event={"ID":"5f67c724-386b-4736-ace1-73430edd3558","Type":"ContainerStarted","Data":"136c9719504ddddb3a72818721408e17db6a72d62ab8121d2be5854433e1e5af"}
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.078489 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.082322 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.086951 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54598: no serving certificate available for the kubelet"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.088227 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" event={"ID":"efeb6afa-e175-4bad-a0bb-5ace61619959","Type":"ContainerStarted","Data":"c4cff8e8d91f9f04f81c77d0489fb00ab6a37272797923c9c73d89dd5bbdff5d"}
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.091646 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body=
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.091781 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.093198 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2rw7x"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.100084 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4zdx6" podStartSLOduration=164.100053264 podStartE2EDuration="2m44.100053264s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.094563577 +0000 UTC m=+235.692496106" watchObservedRunningTime="2026-03-20 13:34:16.100053264 +0000 UTC m=+235.697985793"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.108114 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.608066928 +0000 UTC m=+236.205999457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.189694 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.196392 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.696371239 +0000 UTC m=+236.294303768 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.293081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.293636 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.793617471 +0000 UTC m=+236.391550000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.313122 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-4kg9k" podStartSLOduration=164.313101325 podStartE2EDuration="2m44.313101325s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.309609619 +0000 UTC m=+235.907542148" watchObservedRunningTime="2026-03-20 13:34:16.313101325 +0000 UTC m=+235.911033854"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.341171 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.376586 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podStartSLOduration=164.376566189 podStartE2EDuration="2m44.376566189s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.355761845 +0000 UTC m=+235.953694374" watchObservedRunningTime="2026-03-20 13:34:16.376566189 +0000 UTC m=+235.974498718"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.395226 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.395583 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.895572288 +0000 UTC m=+236.493504817 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.414505 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.426368 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-6zgr4"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.490324 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" podStartSLOduration=164.490302674 podStartE2EDuration="2m44.490302674s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.471505901 +0000 UTC m=+236.069438420" watchObservedRunningTime="2026-03-20 13:34:16.490302674 +0000 UTC m=+236.088235203"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.497037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.497708 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:16.997690409 +0000 UTC m=+236.595622938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.531447 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.550772 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.553566 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podStartSLOduration=164.55354181 podStartE2EDuration="2m44.55354181s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.541841184 +0000 UTC m=+236.139773713" watchObservedRunningTime="2026-03-20 13:34:16.55354181 +0000 UTC m=+236.151474339"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.558830 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.570678 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.600123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.600718 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.100703838 +0000 UTC m=+236.698636357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: W0320 13:34:16.637863 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d33b23_44ac_48b5_8981_fe9a764b1bee.slice/crio-3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46 WatchSource:0}: Error finding container 3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46: Status 404 returned error can't find the container with id 3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.711426 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.711759 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.211732391 +0000 UTC m=+236.809664920 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.712301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.712698 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.212680959 +0000 UTC m=+236.810613488 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: W0320 13:34:16.761847 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b90540c_9ef1_478a_a7a1_48817d0c63d0.slice/crio-d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65 WatchSource:0}: Error finding container d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65: Status 404 returned error can't find the container with id d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.815948 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.816662 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.316631816 +0000 UTC m=+236.914564345 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.833227 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podStartSLOduration=163.833200201 podStartE2EDuration="2m43.833200201s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:16.808834678 +0000 UTC m=+236.406767197" watchObservedRunningTime="2026-03-20 13:34:16.833200201 +0000 UTC m=+236.431132730"
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.836892 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.910370 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2"]
Mar 20 13:34:16 crc kubenswrapper[4755]: I0320 13:34:16.918884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:16 crc kubenswrapper[4755]: E0320 13:34:16.919319 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.419304855 +0000 UTC m=+237.017237384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.023444 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.036337 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86"]
Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.026269 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.526245682 +0000 UTC m=+237.124178221 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.026153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.037043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.037732 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.537712192 +0000 UTC m=+237.135644721 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.058234 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.077639 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-xl5tr" podStartSLOduration=165.077611207 podStartE2EDuration="2m45.077611207s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.075773752 +0000 UTC m=+236.673706291" watchObservedRunningTime="2026-03-20 13:34:17.077611207 +0000 UTC m=+236.675543736" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.110326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.168778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.171399 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.171511 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.671486108 +0000 UTC m=+237.269418647 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.171935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.172294 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.672268621 +0000 UTC m=+237.270201150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.181934 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.190506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" event={"ID":"8b27e760-b22d-415a-93cc-866c2471ee63","Type":"ContainerStarted","Data":"52e70f2c08587f234542371eb63df44cff34c53fc242963b2b61f887897d2d3a"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.214163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" event={"ID":"b41fdebf-1886-4b30-b583-368242316562","Type":"ContainerStarted","Data":"7fdfa8cc5c6e3a309f83f429b136914c6b11eb49f4f2ee209076e8791b5023e5"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.235703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerStarted","Data":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.256345 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"3d62b373b1652ae42e3c1fad52de2c081bde6d8555bf30277b4af91a9e5f1d46"} 
Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.269449 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" event={"ID":"1b9cfbce-3f17-4155-a022-243e6d220bf8","Type":"ContainerStarted","Data":"4b461d0707325d4a02f122a17e58f9e5a84ef93592d659cd6f0a39fa66be7fd6"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.272814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.274248 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.774224658 +0000 UTC m=+237.372157177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.288199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" event={"ID":"673ae012-3e48-4408-8a01-a67833cabd26","Type":"ContainerStarted","Data":"4836923cd199d8b812b8adba92d36b4353a73b0cb19ba9b3e51b5138b611783f"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.305703 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"2ee3f579f873566cac82cfed106e8f9f2d43da1d9062ee0e8d7200af628eac14"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.309239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" event={"ID":"862bcbad-0c15-4e2d-b205-83ab3721cd9a","Type":"ContainerStarted","Data":"ba74282b9aff06f76885c33bcf2503d8ea4ba5de4cc982b7c5f67a676592536e"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.360367 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r48mq" event={"ID":"8b65e162-155e-4d40-ab1a-e3560b29f19f","Type":"ContainerStarted","Data":"8c45dd8423462870cdedd6c6b705c904661b98a7d56d6751675a44638aca219b"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.367937 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:34:17 crc 
kubenswrapper[4755]: I0320 13:34:17.377426 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.379461 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.879433293 +0000 UTC m=+237.477365902 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.380002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"9a0dc9512c594f8b5847c36bf313d8eebf2ece5ed2e39f0f3859a4f9db6478b4"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.381707 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2b4nn" event={"ID":"5f249077-e650-4ad5-b008-7af17910535a","Type":"ContainerStarted","Data":"13eb28fc506d56eb2a8ad7e822339124caa3dc1366c6165033d1f585862c3d02"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 
13:34:17.383081 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-h4gd5" podStartSLOduration=165.383068973 podStartE2EDuration="2m45.383068973s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.37309634 +0000 UTC m=+236.971028869" watchObservedRunningTime="2026-03-20 13:34:17.383068973 +0000 UTC m=+236.981001502" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.411077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" event={"ID":"c9f788c2-2578-4d14-9c8f-115f15a5a817","Type":"ContainerStarted","Data":"73959380ea3e1f8bd230283269d87862536d87ab6399359926234c24e12e021d"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.431549 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54288: no serving certificate available for the kubelet" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.441449 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerStarted","Data":"d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.457364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" event={"ID":"8ff5ba16-93f9-4313-a857-23a1c87c1cac","Type":"ContainerStarted","Data":"77353dcb09a108108eb7a0aefc5e69dc9220ee5f614fd242f9b082fbca5c1150"} Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.459379 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.459423 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.493010 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.499675 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:17.999628915 +0000 UTC m=+237.597561444 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.573098 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"] Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.607141 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.608479 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.108459301 +0000 UTC m=+237.706391830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.709803 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-64g8c" podStartSLOduration=165.709781448 podStartE2EDuration="2m45.709781448s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.709012445 +0000 UTC m=+237.306944974" watchObservedRunningTime="2026-03-20 13:34:17.709781448 +0000 UTC m=+237.307713977" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.710265 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.210238501 +0000 UTC m=+237.808171020 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.710170 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.711025 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.711422 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.211413868 +0000 UTC m=+237.809346397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.790525 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rb5zn" podStartSLOduration=165.790499857 podStartE2EDuration="2m45.790499857s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.746068863 +0000 UTC m=+237.344001392" watchObservedRunningTime="2026-03-20 13:34:17.790499857 +0000 UTC m=+237.388432386" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.812316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.813129 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.313092725 +0000 UTC m=+237.911025284 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.835111 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" podStartSLOduration=165.835086676 podStartE2EDuration="2m45.835086676s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.791075634 +0000 UTC m=+237.389008173" watchObservedRunningTime="2026-03-20 13:34:17.835086676 +0000 UTC m=+237.433019205" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.880651 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.883221 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.897685 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-mw47r" podStartSLOduration=165.897645511 podStartE2EDuration="2m45.897645511s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.897215378 +0000 UTC m=+237.495147907" watchObservedRunningTime="2026-03-20 13:34:17.897645511 
+0000 UTC m=+237.495578040" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.920071 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.921708 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn" Mar 20 13:34:17 crc kubenswrapper[4755]: E0320 13:34:17.923153 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.423133678 +0000 UTC m=+238.021066207 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.982468 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4kg2b" podStartSLOduration=165.982445845 podStartE2EDuration="2m45.982445845s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:17.980051131 +0000 UTC m=+237.577983660" watchObservedRunningTime="2026-03-20 13:34:17.982445845 +0000 UTC m=+237.580378374" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.990154 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:17 crc kubenswrapper[4755]: I0320 13:34:17.990493 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.022772 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.023537 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.523519966 +0000 UTC m=+238.121452495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.069929 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq" podStartSLOduration=166.06990617 podStartE2EDuration="2m46.06990617s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.023871657 +0000 UTC m=+237.621804186" watchObservedRunningTime="2026-03-20 13:34:18.06990617 +0000 UTC m=+237.667838699"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.110070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.124801 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.125852 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.625832624 +0000 UTC m=+238.223765153 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.141495 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-sgkhb"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.212468 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-5h4zh"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.225940 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.226601 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.726586783 +0000 UTC m=+238.324519312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.333536 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.334467 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.834446569 +0000 UTC m=+238.432379098 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.445885 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-r96g9"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.446251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.446679 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:18.946646088 +0000 UTC m=+238.544578617 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.491179 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-hpj2j"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.512984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-2b4nn" event={"ID":"5f249077-e650-4ad5-b008-7af17910535a","Type":"ContainerStarted","Data":"c150f0bbdb95baf7383c53f6a3f6a014b41ffd0c8068ffbdf9a0de37fc2d3222"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.518364 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-r48mq" event={"ID":"8b65e162-155e-4d40-ab1a-e3560b29f19f","Type":"ContainerStarted","Data":"7ed4d9d1329bb8b20ecfe08f82bd47710ddeaf41ded2652e97255ca0e97a27d0"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.549129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.551616 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.051596656 +0000 UTC m=+238.649529185 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.564292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerStarted","Data":"87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.591843 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.591939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" event={"ID":"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9","Type":"ContainerStarted","Data":"2a9190782406b02779e67dc2a7bd5c76d522825551f1d3835561e955bd4878a3"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.610222 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerStarted","Data":"d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.617044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" event={"ID":"1b9cfbce-3f17-4155-a022-243e6d220bf8","Type":"ContainerStarted","Data":"fb51a73fc7c80806a2832de9fd14858918cffb026fa6c342f519a7b50a828454"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.634188 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-2b4nn" podStartSLOduration=6.6341639610000005 podStartE2EDuration="6.634163961s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.555214615 +0000 UTC m=+238.153147144" watchObservedRunningTime="2026-03-20 13:34:18.634163961 +0000 UTC m=+238.232096480"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647571 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body=
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647689 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.647722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-r48mq"
Mar 20 13:34:18 crc kubenswrapper[4755]: W0320 13:34:18.658293 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf19a889_4a85_42c6_aafa_6714754c5a86.slice/crio-45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e WatchSource:0}: Error finding container 45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e: Status 404 returned error can't find the container with id 45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.677808 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.681053 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.688681 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2rw7x"]
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.694220 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.194159508 +0000 UTC m=+238.792092037 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.706220 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-qkxhv"]
Mar 20 13:34:18 crc kubenswrapper[4755]: W0320 13:34:18.712375 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb56ded58_9184_4d39_b422_9ea9e8f6b9ea.slice/crio-73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11 WatchSource:0}: Error finding container 73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11: Status 404 returned error can't find the container with id 73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.759062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"c95ecfcf06991a49718f0ea2768a0a1b8bf011adcc8fa84dee5a421e761d426b"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.769109 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.771390 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-r48mq" podStartSLOduration=166.771366571 podStartE2EDuration="2m46.771366571s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.606990862 +0000 UTC m=+238.204923391" watchObservedRunningTime="2026-03-20 13:34:18.771366571 +0000 UTC m=+238.369299100"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.777797 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-88jhp" podStartSLOduration=166.777770056 podStartE2EDuration="2m46.777770056s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.659772401 +0000 UTC m=+238.257704950" watchObservedRunningTime="2026-03-20 13:34:18.777770056 +0000 UTC m=+238.375702585"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.778397 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j"]
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.786980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" event={"ID":"27e50d13-5c93-4dd7-a2c8-7ba505e2f549","Type":"ContainerStarted","Data":"665676f9bda8b22269ecdde850f1c1d577f3cf41c3859caa40359e6bf9f18eae"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.795446 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" event={"ID":"c9f788c2-2578-4d14-9c8f-115f15a5a817","Type":"ContainerStarted","Data":"98b7aec7ba10ef671338e0b35d8f190727c8b67519161949b38364b314942d6d"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.812600 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.813850 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.313832374 +0000 UTC m=+238.911764893 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.828781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" event={"ID":"eb9a014d-9a58-4461-adc6-2ee3981782a3","Type":"ContainerStarted","Data":"fb76caa1d2db3e98fe4b185f43deb98fbe90661bbf00761b719fc977a3879221"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.828853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" event={"ID":"eb9a014d-9a58-4461-adc6-2ee3981782a3","Type":"ContainerStarted","Data":"af29aee6314a51e0e0f3feacac2582b11ae13af8a02aa2e1e04d014d72c97020"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.829419 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.832576 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8xtw2" podStartSLOduration=166.832553035 podStartE2EDuration="2m46.832553035s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.832298187 +0000 UTC m=+238.430230736" watchObservedRunningTime="2026-03-20 13:34:18.832553035 +0000 UTC m=+238.430485564"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.837455 4755 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-rtzzb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body=
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.837529 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" podUID="eb9a014d-9a58-4461-adc6-2ee3981782a3" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.850967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" event={"ID":"85fb2982-9af0-4450-80f4-12fbd6e7a590","Type":"ContainerStarted","Data":"9fb27c6045547886f781e289c220e8d084d39fa06f1d39729ca9db18c3b8c0e1"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.851050 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" event={"ID":"85fb2982-9af0-4450-80f4-12fbd6e7a590","Type":"ContainerStarted","Data":"7a561f5bbe91e5fed7457d49d2a9ae216e382d0f9f3884fb58722e37f3431ce9"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.866695 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" podStartSLOduration=166.866641233 podStartE2EDuration="2m46.866641233s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.853033249 +0000 UTC m=+238.450965778" watchObservedRunningTime="2026-03-20 13:34:18.866641233 +0000 UTC m=+238.464573762"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.881959 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-2bm86" podStartSLOduration=166.881938539 podStartE2EDuration="2m46.881938539s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.877975309 +0000 UTC m=+238.475907828" watchObservedRunningTime="2026-03-20 13:34:18.881938539 +0000 UTC m=+238.479871058"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.887389 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" event={"ID":"8ff5ba16-93f9-4313-a857-23a1c87c1cac","Type":"ContainerStarted","Data":"be70966701d6cda176ee303d4d649e64a8e74c12777e4d4ee09a445be3d6ed5e"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.889605 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.903928 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-6zgr4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.903991 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podUID="8ff5ba16-93f9-4313-a857-23a1c87c1cac" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.909489 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" event={"ID":"673ae012-3e48-4408-8a01-a67833cabd26","Type":"ContainerStarted","Data":"151e814fabd7eeb84bc7535b6301244d51581b430e1a329c2d08cd114110497b"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.910307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.914413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.919286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"17bd9204ad3a0d50217b0efa90a5d14971587a5d314e7592eccdade678490148"}
Mar 20 13:34:18 crc kubenswrapper[4755]: E0320 13:34:18.919298 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.419273487 +0000 UTC m=+239.017206016 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.931222 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podStartSLOduration=166.9312027 podStartE2EDuration="2m46.9312027s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.928110976 +0000 UTC m=+238.526043505" watchObservedRunningTime="2026-03-20 13:34:18.9312027 +0000 UTC m=+238.529135229"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.934521 4755 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-h4jf2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body=
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.934580 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" podUID="673ae012-3e48-4408-8a01-a67833cabd26" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused"
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.942245 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"ab6e8363864b1a804ddc99c0ed8403ec23675067240e77300a29abe09652e472"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.942305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"8ece1944cd19e7ac1137ed829e08f920fd86a6ab247cd9d747859a2dfa0c9e1c"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.949142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"8fc491215d42b479f9ba11e0ab94bf7f1a250c67800ed6bd25e629cda38bf227"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.953353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" event={"ID":"8b27e760-b22d-415a-93cc-866c2471ee63","Type":"ContainerStarted","Data":"e985a14b8183b03388e3baee0985df9660fce2dd7c4cf09d74c9a75004b92046"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.955438 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerStarted","Data":"c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df"}
Mar 20 13:34:18 crc kubenswrapper[4755]: I0320 13:34:18.961053 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" podStartSLOduration=166.96103581 podStartE2EDuration="2m46.96103581s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.959680768 +0000 UTC m=+238.557613297" watchObservedRunningTime="2026-03-20 13:34:18.96103581 +0000 UTC m=+238.558968339"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.003116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" event={"ID":"c5e0183e-e0f5-4b89-a2f9-27fc07783e27","Type":"ContainerStarted","Data":"ee1bd8b4abe872c704afebc0bff51de7c8acb2686a5e91fbbc3df0cf4c48714b"}
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.003299 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.016280 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-92hxn"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.017235 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.019550 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.519518372 +0000 UTC m=+239.117450901 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.026368 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vdgvl" podStartSLOduration=167.026338269 podStartE2EDuration="2m47.026338269s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:18.985160394 +0000 UTC m=+238.583092923" watchObservedRunningTime="2026-03-20 13:34:19.026338269 +0000 UTC m=+238.624270798"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.034096 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m66xn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body=
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.034259 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podUID="c5e0183e-e0f5-4b89-a2f9-27fc07783e27" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.047208 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-lp7qq"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.048811 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" podStartSLOduration=167.048788873 podStartE2EDuration="2m47.048788873s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.047604997 +0000 UTC m=+238.645537526" watchObservedRunningTime="2026-03-20 13:34:19.048788873 +0000 UTC m=+238.646721402"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.049065 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" podStartSLOduration=167.049059741 podStartE2EDuration="2m47.049059741s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.0142078 +0000 UTC m=+238.612140329" watchObservedRunningTime="2026-03-20 13:34:19.049059741 +0000 UTC m=+238.646992270"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.120009 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.121946 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.62190839 +0000 UTC m=+239.219840909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.168871 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podStartSLOduration=167.168850711 podStartE2EDuration="2m47.168850711s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:19.160037643 +0000 UTC m=+238.757970192" watchObservedRunningTime="2026-03-20 13:34:19.168850711 +0000 UTC m=+238.766783240"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.223037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.224377 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.724354042 +0000 UTC m=+239.322286571 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.288148 4755 patch_prober.go:28] interesting pod/apiserver-76f77b778f-6ql2s container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]log ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]etcd ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/start-apiserver-admission-initializer ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/generic-apiserver-start-informers ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/max-in-flight-filter ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/storage-object-count-tracker-hook ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/image.openshift.io-apiserver-caches ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld
Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectcache ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [-]poststarthook/openshift.io-startinformers failed: reason withheld
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/openshift.io-restmapperupdater ok
Mar 20 13:34:19 crc kubenswrapper[4755]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Mar 20 13:34:19 crc kubenswrapper[4755]: livez check failed
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.288242 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" podUID="b41fdebf-1886-4b30-b583-368242316562" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.329403 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.329568 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.829542326 +0000 UTC m=+239.427474855 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.329703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.330042 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.830029951 +0000 UTC m=+239.427962480 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.434519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.434647 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.934629648 +0000 UTC m=+239.532562177 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.435055 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.435559 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:19.935550536 +0000 UTC m=+239.533483065 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.537340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.537528 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.037500702 +0000 UTC m=+239.635433231 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.538223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.539389 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.039376649 +0000 UTC m=+239.637309368 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.639759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.647099 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.147068611 +0000 UTC m=+239.745001140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.647322 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.647896 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.147885426 +0000 UTC m=+239.745817945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.668979 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:19 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:19 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.669083 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.752680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.753019 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:20.252977117 +0000 UTC m=+239.850909646 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.753203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.753734 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.25372583 +0000 UTC m=+239.851658359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.854753 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.855426 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.355368417 +0000 UTC m=+239.953300956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.856124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.856623 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.356596564 +0000 UTC m=+239.954529093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.957401 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.957583 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.457548839 +0000 UTC m=+240.055481368 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:19 crc kubenswrapper[4755]: I0320 13:34:19.958001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:19 crc kubenswrapper[4755]: E0320 13:34:19.958441 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.458425007 +0000 UTC m=+240.056357536 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.059186 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.059438 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.559399053 +0000 UTC m=+240.157331582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.059597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.060039 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.560023202 +0000 UTC m=+240.157955721 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.065637 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54302: no serving certificate available for the kubelet" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.087162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" event={"ID":"1d48e3f4-e9d9-4249-ab69-30e3ca2c98f9","Type":"ContainerStarted","Data":"074f11e0e9f7a406a73c13235fca4b755abec8ffe9f594e1827d35673d23f526"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.092882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"26c43a8cf27365109f74a654174b902d4ce3f9384b9e7037104956dadbda0dcf"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.092940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"45a144e5531b39217c89b78ce7e05b04f6a2f9fafe4c7128988bc56ced14af7e"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.105618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" event={"ID":"04d33b23-44ac-48b5-8981-fe9a764b1bee","Type":"ContainerStarted","Data":"34e6784586bb9dbd83ead2f26100ea97ea937ff50931de64353977525ad70705"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.117334 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" event={"ID":"14060dd8-a97a-404b-9020-9f9e519e78d9","Type":"ContainerStarted","Data":"7e379a11cc9f18b070bfba4bf9f24341cf71c9ebadf67f86b32d424ed7cbfe6f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.124142 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-sgkhb" podStartSLOduration=167.124097134 podStartE2EDuration="2m47.124097134s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.122946038 +0000 UTC m=+239.720878567" watchObservedRunningTime="2026-03-20 13:34:20.124097134 +0000 UTC m=+239.722029663" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.136029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-ggscl" event={"ID":"c452cf60-67a1-434f-b2da-7e81992e28a6","Type":"ContainerStarted","Data":"562703a64193302ee027aec6434993d311409931f91424a8a3bc7f45352d8b93"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.174883 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.177388 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.677364467 +0000 UTC m=+240.275296996 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.181952 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" event={"ID":"0e40521a-c254-4fd5-99e8-1296dd288e2d","Type":"ContainerStarted","Data":"1527d02eab2d2272d4b2383ebbd8ffe7459e690c29c19db34298ea1367d33353"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.182006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" event={"ID":"0e40521a-c254-4fd5-99e8-1296dd288e2d","Type":"ContainerStarted","Data":"36bf03bc21c96c081022e5457100171ea34a5f0c7e3891bd2a94a15d54fc11ec"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.195732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"c2e934a6dd55c7072b44da340f236e922e8823aa86ab19cc0bd48fdeb7c98ad1"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.195779 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" event={"ID":"e8774c8e-1dd9-481c-9091-85a2fe704069","Type":"ContainerStarted","Data":"d39ca1909e22b17c2b346fc275569453aa15b6c7c78769bd4ee043798e72356f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.207599 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-lth4v" podStartSLOduration=168.207571798 podStartE2EDuration="2m48.207571798s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.206827794 +0000 UTC m=+239.804760323" watchObservedRunningTime="2026-03-20 13:34:20.207571798 +0000 UTC m=+239.805504327" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.207765 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-jxcch" podStartSLOduration=168.207759103 podStartE2EDuration="2m48.207759103s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.174288243 +0000 UTC m=+239.772220772" watchObservedRunningTime="2026-03-20 13:34:20.207759103 +0000 UTC m=+239.805691642" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.213434 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"b569574f673b8dcaa9e3744f50b218f55339a7ccb1cb77efb3cb16f9547b1966"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.213497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" event={"ID":"903404df-f7c6-46d5-9227-748ecc920ac3","Type":"ContainerStarted","Data":"a1e27ddb2360de2885346f06685fb71d108322ddf8db9a5aab39b6a0caaeb336"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.229600 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" 
event={"ID":"d8def433-c490-4469-9e43-12ba06428091","Type":"ContainerStarted","Data":"d8f4b5903ec1b8d49a7f0244972cd97783793a7d76ccefe2e5e34e7d48888409"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.229673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" event={"ID":"d8def433-c490-4469-9e43-12ba06428091","Type":"ContainerStarted","Data":"cd02889b8f102d407e8cda11aebe9cac21194831626783b6dc4045596519d058"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"165a7083062ced917c24d885b7a0d89257d3da32d002f75a9e3c209f10e72f59"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"d5827cbd34536cc1d64209344c68c388f554f3d5f6e7218403fb14d4bdf44d46"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.252305 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" event={"ID":"4ea64dfa-8103-47fc-9ad3-693b033a1ec1","Type":"ContainerStarted","Data":"06c181e0fbc16493dcf66776b27278ff10f19457ee1e23f63560cbac303f5743"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.253015 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.271461 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-qkxhv" podStartSLOduration=168.271441763 podStartE2EDuration="2m48.271441763s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.268007138 +0000 UTC m=+239.865939667" watchObservedRunningTime="2026-03-20 13:34:20.271441763 +0000 UTC m=+239.869374292" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.274876 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r96g9" event={"ID":"51b44b44-8a09-430a-ba3c-92e2c2f916f6","Type":"ContainerStarted","Data":"6a470248db5bcdcbd851ec9d93aeae048e560d5e4f3964642d2bb0e261783b5f"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.274932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-r96g9" event={"ID":"51b44b44-8a09-430a-ba3c-92e2c2f916f6","Type":"ContainerStarted","Data":"1523344c4416ca2fa6a9d36b485c0350c987b557609e59798908e6ef70e640c3"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.277400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.280114 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.780088727 +0000 UTC m=+240.378021456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerStarted","Data":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291239 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerStarted","Data":"ffd17bcea5582e9144ff86b2de342c1b3c61951742cefde886baf98d6e66252d"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.291778 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.293700 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-229g6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.293752 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.308988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"e40bc158eb9411a59253f82e1e8bcbc74fd16a747555f5dfd4f844a70cd069a7"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.357177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" event={"ID":"27e50d13-5c93-4dd7-a2c8-7ba505e2f549","Type":"ContainerStarted","Data":"5a34fdf03efebf84f169fb8b7aa970bfd47809ef847318147656f3236950a798"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.385329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.387185 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.887161669 +0000 UTC m=+240.485094198 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.400472 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" event={"ID":"c5e0183e-e0f5-4b89-a2f9-27fc07783e27","Type":"ContainerStarted","Data":"e8aa715492c4c86ff9bf4336ddb65914dd8f892dbd26482bc316444af61d8881"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.446792 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-52kqn" podStartSLOduration=168.446759975 podStartE2EDuration="2m48.446759975s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.29959371 +0000 UTC m=+239.897526239" watchObservedRunningTime="2026-03-20 13:34:20.446759975 +0000 UTC m=+240.044692504" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.448339 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" event={"ID":"b56ded58-9184-4d39-b422-9ea9e8f6b9ea","Type":"ContainerStarted","Data":"c9390327472cf673d0b59be92f646bdef72fc5b3ee002719db11b571fd677600"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.448398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" 
event={"ID":"b56ded58-9184-4d39-b422-9ea9e8f6b9ea","Type":"ContainerStarted","Data":"73b9cd0af158df7358f39e0668c4d8d07746094e034f7c273afbc78eac3f9a11"} Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.450328 4755 patch_prober.go:28] interesting pod/console-operator-58897d9998-6zgr4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.450383 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-6zgr4" podUID="8ff5ba16-93f9-4313-a857-23a1c87c1cac" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/readyz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.471534 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-cc2rk" podStartSLOduration=168.471516539 podStartE2EDuration="2m48.471516539s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.426033233 +0000 UTC m=+240.023965762" watchObservedRunningTime="2026-03-20 13:34:20.471516539 +0000 UTC m=+240.069449068" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.490788 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 
13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.491471 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:20.991449396 +0000 UTC m=+240.589381925 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.495216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-h4jf2" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.510819 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-5h4zh" podStartSLOduration=168.510790385 podStartE2EDuration="2m48.510790385s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.474927533 +0000 UTC m=+240.072860062" watchObservedRunningTime="2026-03-20 13:34:20.510790385 +0000 UTC m=+240.108722914" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.512964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-rtzzb" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.581450 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress-canary/ingress-canary-r96g9" podStartSLOduration=8.581424957 podStartE2EDuration="8.581424957s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.513179418 +0000 UTC m=+240.111111947" watchObservedRunningTime="2026-03-20 13:34:20.581424957 +0000 UTC m=+240.179357486" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.593452 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.595712 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.095679871 +0000 UTC m=+240.693612390 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.627411 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podStartSLOduration=168.627377427 podStartE2EDuration="2m48.627377427s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.622130438 +0000 UTC m=+240.220062967" watchObservedRunningTime="2026-03-20 13:34:20.627377427 +0000 UTC m=+240.225309956" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.636142 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" podStartSLOduration=168.636102184 podStartE2EDuration="2m48.636102184s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.583259703 +0000 UTC m=+240.181192222" watchObservedRunningTime="2026-03-20 13:34:20.636102184 +0000 UTC m=+240.234034713" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.652022 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:20 crc 
kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:20 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:20 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.652133 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.696690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.698606 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.198565946 +0000 UTC m=+240.796498475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.749547 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-zkdvd" podStartSLOduration=167.749520709 podStartE2EDuration="2m47.749520709s" podCreationTimestamp="2026-03-20 13:31:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.701759263 +0000 UTC m=+240.299691792" watchObservedRunningTime="2026-03-20 13:34:20.749520709 +0000 UTC m=+240.347453238" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.801329 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.801933 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.301917735 +0000 UTC m=+240.899850264 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.836809 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-4n95j" podStartSLOduration=168.836779247 podStartE2EDuration="2m48.836779247s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:20.828569887 +0000 UTC m=+240.426502416" watchObservedRunningTime="2026-03-20 13:34:20.836779247 +0000 UTC m=+240.434711776" Mar 20 13:34:20 crc kubenswrapper[4755]: I0320 13:34:20.903690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:20 crc kubenswrapper[4755]: E0320 13:34:20.904064 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.404049906 +0000 UTC m=+241.001982435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.006478 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.007469 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.507440917 +0000 UTC m=+241.105373456 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.109416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.110319 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.610303031 +0000 UTC m=+241.208235560 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.158920 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.160372 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.164341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.214708 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.714684661 +0000 UTC m=+241.312617190 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.214575 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.215001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.215493 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.715486385 +0000 UTC m=+241.313418914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.263326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.330764 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.338771 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.83873075 +0000 UTC m=+241.436663279 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339511 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.339554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.340165 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.840156623 +0000 UTC m=+241.438089152 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.378003 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.397899 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.400934 4755 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-m66xn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.401419 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn" podUID="c5e0183e-e0f5-4b89-a2f9-27fc07783e27" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.402257 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.420351 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgznb"]
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.441969 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.941939384 +0000 UTC m=+241.539871913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.441977 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.442756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.443273 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:21.943255485 +0000 UTC m=+241.541188014 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.443322 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.443397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.489150 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"73232c6917a3217823fbedbea721a4e33d2ce945f5dbad74eb27d3fb0c436a58"}
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.494016 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2rw7x" event={"ID":"af19a889-4a85-42c6-aafa-6714754c5a86","Type":"ContainerStarted","Data":"ae5b7ba16724b9e103a5e8ab55de8a2abf69a222f5e75301c5843aab39aee6df"}
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.495030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"certified-operators-shzbw\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.502477 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.514602 4755 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-229g6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.514986 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544301 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544544 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2rw7x" podStartSLOduration=9.5445247 podStartE2EDuration="9.5445247s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:21.541810097 +0000 UTC m=+241.139742626" watchObservedRunningTime="2026-03-20 13:34:21.5445247 +0000 UTC m=+241.142457219"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544699 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.544806 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.545275 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.045253622 +0000 UTC m=+241.643186151 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.567843 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"]
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.568982 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.588884 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"]
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.602337 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-6zgr4"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.637990 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:34:21 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 20 13:34:21 crc kubenswrapper[4755]: [+]process-running ok
Mar 20 13:34:21 crc kubenswrapper[4755]: healthz check failed
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.638052 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.654935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.655986 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656103 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656742 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.656909 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.663571 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.163545776 +0000 UTC m=+241.761478535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.665953 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.666229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.691829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-m66xn"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.699953 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"community-operators-cgznb\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766622 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.766688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.767100 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.26708308 +0000 UTC m=+241.865015609 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.767495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.767854 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.796459 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vm24m"]
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.796861 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.797572 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.803919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"certified-operators-d8rq7\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.830046 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm24m"]
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.885931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.886316 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.386299723 +0000 UTC m=+241.984232252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.936084 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.988924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989332 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:21 crc kubenswrapper[4755]: I0320 13:34:21.989414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:21 crc kubenswrapper[4755]: E0320 13:34:21.989540 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.489518498 +0000 UTC m=+242.087451027 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090534 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090621 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.090706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.091619 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.092171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.092463 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.592451233 +0000 UTC m=+242.190383762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.097838 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-2rw7x"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.144097 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"community-operators-vm24m\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.149611 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.191518 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.192233 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.692210753 +0000 UTC m=+242.290143292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.193517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"]
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.300346 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.300856 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.800843223 +0000 UTC m=+242.398775742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.408477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.409606 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:22.909584076 +0000 UTC m=+242.507516605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.422148 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"]
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.422751 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" containerID="cri-o://621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" gracePeriod=30
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.444228 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"]
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.444637 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" containerID="cri-o://f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" gracePeriod=30
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.512930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.513376 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.013361127 +0000 UTC m=+242.611293656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.581332 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cgznb"]
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.603095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerStarted","Data":"0ab76dafe853da1151a253ddbccefd2f71d9bf47c5abfc10da67278b7f81253e"}
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.614411 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.614916 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.114894621 +0000 UTC m=+242.712827140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.633615 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:34:22 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 20 13:34:22 crc kubenswrapper[4755]: [+]process-running ok
Mar 20 13:34:22 crc kubenswrapper[4755]: healthz check failed
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.633714 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.694259 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"]
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.718803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.719207 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.219191218 +0000 UTC m=+242.817123747 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.821310 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.821859 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.321838506 +0000 UTC m=+242.919771045 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.923374 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:22 crc kubenswrapper[4755]: E0320 13:34:22.923746 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.42372857 +0000 UTC m=+243.021661099 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.995487 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.996511 4755 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-g4ftg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 13:34:22 crc kubenswrapper[4755]: I0320 13:34:22.996575 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.010927 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-6ql2s" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.024141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.024711 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.524691755 +0000 UTC m=+243.122624284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.125819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.126350 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.626329893 +0000 UTC m=+243.224262422 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.141228 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157051 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.157413 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157429 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.157561 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.158539 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.163355 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.175892 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227451 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227568 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227608 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") pod \"d0eef306-2a08-40d1-82cf-ad6d81923c67\" (UID: \"d0eef306-2a08-40d1-82cf-ad6d81923c67\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227800 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.227873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229153 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229198 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config" (OuterVolumeSpecName: "config") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.229501 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.729304039 +0000 UTC m=+243.327236748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.229522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.244121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7" (OuterVolumeSpecName: "kube-api-access-tglm7") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "kube-api-access-tglm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.254179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0eef306-2a08-40d1-82cf-ad6d81923c67" (UID: "d0eef306-2a08-40d1-82cf-ad6d81923c67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.331747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332129 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332244 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332409 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0eef306-2a08-40d1-82cf-ad6d81923c67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332414 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332424 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332663 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tglm7\" (UniqueName: \"kubernetes.io/projected/d0eef306-2a08-40d1-82cf-ad6d81923c67-kube-api-access-tglm7\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332683 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.332697 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0eef306-2a08-40d1-82cf-ad6d81923c67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.333098 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.833079231 +0000 UTC m=+243.431011970 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.334196 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.336004 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.351802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:34:23 crc kubenswrapper[4755]: W0320 13:34:23.394805 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184aa529_45c4_42c9_8eee_04bd18fba718.slice/crio-24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509 WatchSource:0}: Error finding container 24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509: Status 404 returned error can't find the container with id 24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.422218 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"redhat-marketplace-929x7\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.438896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439439 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.439570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdrkd\" (UniqueName: 
\"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") pod \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\" (UID: \"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.441840 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:23.941802534 +0000 UTC m=+243.539735063 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.442497 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config" (OuterVolumeSpecName: "config") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.442647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca" (OuterVolumeSpecName: "client-ca") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.446465 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd" (OuterVolumeSpecName: "kube-api-access-zdrkd") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "kube-api-access-zdrkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.446876 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" (UID: "53c8ae1f-e5a9-4ac8-8af7-2169378af3d2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.488200 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.553968 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554205 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554222 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554278 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdrkd\" (UniqueName: \"kubernetes.io/projected/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-kube-api-access-zdrkd\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.554293 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.554771 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.054728044 +0000 UTC m=+243.652660573 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.558334 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.559071 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.560159 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.574082 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerName="route-controller-manager" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.576802 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.585458 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.636299 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:23 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:23 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:23 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.636370 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642814 4755 generic.go:334] "Generic (PLEG): container finished" podID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642899 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerDied","Data":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" 
event={"ID":"d0eef306-2a08-40d1-82cf-ad6d81923c67","Type":"ContainerDied","Data":"ea03e21c825372e4f508e4183f07bab9440aa36d8af7963578ed0bad5bcf3f8f"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.642948 4755 scope.go:117] "RemoveContainer" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.643115 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655591 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.655767 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.155737782 +0000 UTC m=+243.753670311 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.655968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nblfx\" (UniqueName: 
\"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.656370 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.156353501 +0000 UTC m=+243.754286040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661477 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.661722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"28734b0d2914118b3d9d2819be5a8fd3a2768be1a04f071ed6cc45a5baf248f6"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 
13:34:23.692600 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.692805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699069 4755 generic.go:334] "Generic (PLEG): container finished" podID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699168 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerDied","Data":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699207 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" event={"ID":"53c8ae1f-e5a9-4ac8-8af7-2169378af3d2","Type":"ContainerDied","Data":"3223931b89286699e698342d15f3a7a85a00629ea5fea33a44c5e665454a0198"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.699305 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.710514 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.715554 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-pz64x"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.719678 4755 scope.go:117] "RemoveContainer" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.722608 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": container with ID starting with 621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4 not found: ID does not exist" containerID="621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.723023 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4"} err="failed to get container status \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": rpc error: code = NotFound desc = could not find container \"621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4\": container with ID starting with 621e2761ce4e482c480741eacf521d7985e5a65870ea506b762f05e435ff4fa4 not found: ID does not exist" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.723061 4755 scope.go:117] "RemoveContainer" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750321 4755 generic.go:334] "Generic 
(PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750830 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.750915 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerStarted","Data":"24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.757794 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 
20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.758139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.759441 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.259422031 +0000 UTC m=+243.857354560 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.759847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.760558 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" 
Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.762512 4755 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785008 4755 generic.go:334] "Generic (PLEG): container finished" podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" exitCode=0 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785136 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.785176 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerStarted","Data":"b82219efa86cff3e92cd1609c0f3a02dacbb886afd0558266c139f378ee30512"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.790718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"redhat-marketplace-nlslg\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.799520 4755 scope.go:117] "RemoveContainer" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.799890 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": container with ID 
starting with f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf not found: ID does not exist" containerID="f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.799926 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf"} err="failed to get container status \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": rpc error: code = NotFound desc = could not find container \"f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf\": container with ID starting with f2d1bd01e4b8a4746e3e240083fd8cb11e34a15700ff85250d88c704b0e18dcf not found: ID does not exist" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.800900 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"62fcbeafe490ddef0d169e1b68e282bff334419b04ae05d301d4286b25cdfb58"} Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.804931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.807577 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-g4ftg"] Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.861696 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:23 crc kubenswrapper[4755]: 
E0320 13:34:23.864883 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.364866203 +0000 UTC m=+243.962798742 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.904114 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:34:23 crc kubenswrapper[4755]: W0320 13:34:23.935684 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d2017d2_f4ee_4056_b350_cc313f3faeaf.slice/crio-3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27 WatchSource:0}: Error finding container 3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27: Status 404 returned error can't find the container with id 3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27 Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.954165 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.963049 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:23 crc kubenswrapper[4755]: E0320 13:34:23.963508 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.463486027 +0000 UTC m=+244.061418556 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.979035 4755 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-pz64x container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: i/o timeout" start-of-body= Mar 20 13:34:23 crc kubenswrapper[4755]: I0320 13:34:23.979217 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-pz64x" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: i/o timeout" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:23.997844 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:23.999458 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.009851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.010135 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.019135 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026794 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" start-of-body= Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026866 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.026803 4755 patch_prober.go:28] interesting pod/downloads-7954f5f757-l4v7x container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" 
start-of-body= Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.027242 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-l4v7x" podUID="9a8d45e6-cf9d-4f6f-b459-efe220bbf6d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.26:8080/\": dial tcp 10.217.0.26:8080: connect: connection refused" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.064903 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.065053 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.065133 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.065617 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.565603369 +0000 UTC m=+244.163535898 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.167173 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.167383 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.667349019 +0000 UTC m=+244.265281548 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.168084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.170436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.170801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.170858 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.670847855 +0000 UTC m=+244.268780384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.171280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.199341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.273895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.274259 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.774241745 +0000 UTC m=+244.372174274 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.350801 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.353507 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361830 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361865 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.361933 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.362234 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.362746 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.363744 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.366374 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.367475 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.370069 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.370678 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.371010 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.371318 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372732 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.372861 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.375272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.375784 4755 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.875764209 +0000 UTC m=+244.473696738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.382678 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.398724 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.468557 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.476247 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:34:24.976201878 +0000 UTC m=+244.574134407 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476441 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.476978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477194 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477493 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.477534 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: E0320 13:34:24.478077 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:34:24.978061975 +0000 UTC m=+244.575994504 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-bckdl" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:34:24 crc kubenswrapper[4755]: W0320 13:34:24.478098 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce4d5763_1786_4b87_8497_0c65da46f446.slice/crio-bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e WatchSource:0}: Error finding container bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e: Status 404 returned error can't find the container with id bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.499719 4755 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:34:23.762541456Z","Handler":null,"Name":""} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.505067 4755 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.505133 4755 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.544924 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 
13:34:24.546076 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.552063 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.555168 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.578870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579119 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579189 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579344 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc 
kubenswrapper[4755]: I0320 13:34:24.579470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.579515 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.582038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.583537 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.583995 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " 
pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.584903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.585174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589531 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589794 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.589939 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.611278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"controller-manager-5db74bc9fd-wp4mj\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.621077 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"route-controller-manager-589b99697b-vh78j\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.629100 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with 
statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:34:24 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld Mar 20 13:34:24 crc kubenswrapper[4755]: [+]process-running ok Mar 20 13:34:24 crc kubenswrapper[4755]: healthz check failed Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.629168 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682481 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682968 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.682999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.683102 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.685976 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.686842 4755 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.686918 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.735988 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-bckdl\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.745661 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.784660 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.785066 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.785174 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " 
pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.815603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"redhat-operators-lkvql\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.852018 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.852372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.854961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerStarted","Data":"bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878104 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.878853 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerStarted","Data":"3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.886441 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.928067 4755 generic.go:334] "Generic (PLEG): container finished" podID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerID="c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df" exitCode=0 Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.928431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerDied","Data":"c665d16b7a10e01bc81b15b75ff9e9a77ed47b241ec39af37be788d1bbbe03df"} Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.951545 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.979530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.979855 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.980026 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:34:24 crc kubenswrapper[4755]: I0320 13:34:24.990406 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.008819 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.010224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.018830 4755 patch_prober.go:28] interesting pod/console-f9d7485db-rb5zn container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.020214 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rb5zn" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" probeResult="failure" output="Get \"https://10.217.0.30:8443/health\": dial tcp 10.217.0.30:8443: connect: connection refused"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.020929 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"882b5ec0574afffeb68d87ce1070f73838a10020e187a200742440e96eb7d45d"}
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.021086 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" event={"ID":"da3395eb-3396-4bbc-8a18-3d57519c4667","Type":"ContainerStarted","Data":"06d905723e36898a8fe0fb2068c3547b72ddb1a94c670b4c5b7f5f3d14d9b16e"}
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.043708 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.069217 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-hpj2j" podStartSLOduration=13.069187405 podStartE2EDuration="13.069187405s" podCreationTimestamp="2026-03-20 13:34:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:25.05623673 +0000 UTC m=+244.654169249" watchObservedRunningTime="2026-03-20 13:34:25.069187405 +0000 UTC m=+244.667119934"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.093636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.094071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.094292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.128720 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.196580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.196877 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.196929 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.197562 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.198466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: W0320 13:34:25.210081 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod798ec963_27eb_429b_8cbd_310fbf41feb2.slice/crio-29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28 WatchSource:0}: Error finding container 29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28: Status 404 returned error can't find the container with id 29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.229872 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"redhat-operators-lkkl5\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.233636 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54304: no serving certificate available for the kubelet"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.275900 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c8ae1f-e5a9-4ac8-8af7-2169378af3d2" path="/var/lib/kubelet/pods/53c8ae1f-e5a9-4ac8-8af7-2169378af3d2/volumes"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.277516 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.278401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0eef306-2a08-40d1-82cf-ad6d81923c67" path="/var/lib/kubelet/pods/d0eef306-2a08-40d1-82cf-ad6d81923c67/volumes"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.329620 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.388775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.476452 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.482415 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.482532 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.484697 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.484747 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.507038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"]
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.512522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.512609 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.521215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"]
Mar 20 13:34:25 crc kubenswrapper[4755]: W0320 13:34:25.608634 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408c6869_42d8_4cbc_a261_57fb45f0d666.slice/crio-500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6 WatchSource:0}: Error finding container 500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6: Status 404 returned error can't find the container with id 500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.619677 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.620251 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.620546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.631563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-r48mq"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.637278 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:34:25 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 20 13:34:25 crc kubenswrapper[4755]: [+]process-running ok
Mar 20 13:34:25 crc kubenswrapper[4755]: healthz check failed
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.637347 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.641030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.823585 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 13:34:25 crc kubenswrapper[4755]: I0320 13:34:25.983209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-229g6"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.076872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerStarted","Data":"a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.112419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerStarted","Data":"500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.141338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerStarted","Data":"67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.141410 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerStarted","Data":"29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.142702 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.170019 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"]
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.179009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"ea85ece18daec304b9cecefa9ca55b3c7ddbfc128e021ebc4bfd2b1a692b4346"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.201923 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podStartSLOduration=4.201905245 podStartE2EDuration="4.201905245s" podCreationTimestamp="2026-03-20 13:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:26.200128461 +0000 UTC m=+245.798060990" watchObservedRunningTime="2026-03-20 13:34:26.201905245 +0000 UTC m=+245.799837774"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.214974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerStarted","Data":"1f5e663be2a59aec380b38807a322f3980150870b6f5114ac3d543094b13a3ea"}
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.278420 4755 ???:1] "http: TLS handshake error from 192.168.126.11:54306: no serving certificate available for the kubelet"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.340633 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 13:34:26 crc kubenswrapper[4755]: W0320 13:34:26.378974 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod90e64094_373f_4a6b_ad6d_a68096ece17d.slice/crio-484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768 WatchSource:0}: Error finding container 484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768: Status 404 returned error can't find the container with id 484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630095 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630555 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:34:26 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 20 13:34:26 crc kubenswrapper[4755]: [+]process-running ok
Mar 20 13:34:26 crc kubenswrapper[4755]: healthz check failed
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.630589 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.733630 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.745831 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") "
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.745982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") "
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.746056 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") pod \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\" (UID: \"4b90540c-9ef1-478a-a7a1-48817d0c63d0\") "
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.748028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.760722 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.762552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz" (OuterVolumeSpecName: "kube-api-access-mm6mz") pod "4b90540c-9ef1-478a-a7a1-48817d0c63d0" (UID: "4b90540c-9ef1-478a-a7a1-48817d0c63d0"). InnerVolumeSpecName "kube-api-access-mm6mz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.847994 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4b90540c-9ef1-478a-a7a1-48817d0c63d0-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.848028 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mm6mz\" (UniqueName: \"kubernetes.io/projected/4b90540c-9ef1-478a-a7a1-48817d0c63d0-kube-api-access-mm6mz\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:26 crc kubenswrapper[4755]: I0320 13:34:26.848050 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4b90540c-9ef1-478a-a7a1-48817d0c63d0-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.152495 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.156154 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.187156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/37d1e037-c169-4932-9928-f3d23ff47c07-metrics-certs\") pod \"network-metrics-daemon-kpm42\" (UID: \"37d1e037-c169-4932-9928-f3d23ff47c07\") " pod="openshift-multus/network-metrics-daemon-kpm42"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268081 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268122 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerStarted","Data":"484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.268162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerStarted","Data":"f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.270976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerStarted","Data":"a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.275013 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerStarted","Data":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.275846 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.279959 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.280056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-k7h8h" event={"ID":"4b90540c-9ef1-478a-a7a1-48817d0c63d0","Type":"ContainerDied","Data":"d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.280130 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e68c21eae2a81473ad2f98b043f6ebfc39f654563a8ab7f0d8ef841d7aef65"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.282188 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podStartSLOduration=5.282168217 podStartE2EDuration="5.282168217s" podCreationTimestamp="2026-03-20 13:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.280379493 +0000 UTC m=+246.878312022" watchObservedRunningTime="2026-03-20 13:34:27.282168217 +0000 UTC m=+246.880100746"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294347 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294500 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c" exitCode=0
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.294717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"e8b20b8055283611079980efbe691798926e8e8d967c03c6c2d40a174aa03339"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.304747 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" exitCode=0
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.305069 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad"}
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.305718 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.307327 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.314581 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-kpm42"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.314599 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" podStartSLOduration=175.314566075 podStartE2EDuration="2m55.314566075s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.308926303 +0000 UTC m=+246.906858852" watchObservedRunningTime="2026-03-20 13:34:27.314566075 +0000 UTC m=+246.912498604"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.336291 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=4.336263385 podStartE2EDuration="4.336263385s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:27.330427257 +0000 UTC m=+246.928359786" watchObservedRunningTime="2026-03-20 13:34:27.336263385 +0000 UTC m=+246.934195914"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.632028 4755 patch_prober.go:28] interesting pod/router-default-5444994796-r48mq container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:34:27 crc kubenswrapper[4755]: [-]has-synced failed: reason withheld
Mar 20 13:34:27 crc kubenswrapper[4755]: [+]process-running ok
Mar 20 13:34:27 crc kubenswrapper[4755]: healthz check failed
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.632486 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-r48mq" podUID="8b65e162-155e-4d40-ab1a-e3560b29f19f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:34:27 crc kubenswrapper[4755]: I0320 13:34:27.962364 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-kpm42"]
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.329938 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerStarted","Data":"632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c"}
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.340677 4755 generic.go:334] "Generic (PLEG): container finished" podID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerID="a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597" exitCode=0
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.340871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerDied","Data":"a38ecde488458ddf6ff6ad80f05c544025f8f6c6547c5ca4206fb91093240597"}
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.351241 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.351213408 podStartE2EDuration="3.351213408s" podCreationTimestamp="2026-03-20 13:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:28.347192405 +0000 UTC m=+247.945124954" watchObservedRunningTime="2026-03-20 13:34:28.351213408 +0000 UTC m=+247.949145947"
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.370481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"fa60a902afe6f653701116f8903fb4077b7003ddd2d261f9fa3d07443b01a9b4"}
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.629626 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-r48mq"
Mar 20 13:34:28 crc kubenswrapper[4755]: I0320 13:34:28.635087 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-r48mq"
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.382562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerDied","Data":"632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c"}
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.392040 4755 generic.go:334] "Generic (PLEG): container finished" podID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerID="632a9a4fada7a8c98e195a0e2170860a1efb7d0f8bbf52f718f07ac386a75e8c" exitCode=0
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.403122 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"20415f7e779e5387c6e3dd577beaa0b2a4b0b1363e9385777ac24a07168446d6"}
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.829058 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.936924 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") pod \"b86b19d6-a389-4b02-b514-f828f685b7fc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") "
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.936992 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") pod \"b86b19d6-a389-4b02-b514-f828f685b7fc\" (UID: \"b86b19d6-a389-4b02-b514-f828f685b7fc\") "
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.937124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b86b19d6-a389-4b02-b514-f828f685b7fc" (UID: "b86b19d6-a389-4b02-b514-f828f685b7fc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.937436 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b86b19d6-a389-4b02-b514-f828f685b7fc-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:29 crc kubenswrapper[4755]: I0320 13:34:29.980355 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b86b19d6-a389-4b02-b514-f828f685b7fc" (UID: "b86b19d6-a389-4b02-b514-f828f685b7fc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.039140 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b86b19d6-a389-4b02-b514-f828f685b7fc-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"b86b19d6-a389-4b02-b514-f828f685b7fc","Type":"ContainerDied","Data":"a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855"}
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427563 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.427579 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61328313f937859da2cf48f31d75e9a9fe762c184fc7b19d3a3054d5f888855"
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.432625 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-kpm42" event={"ID":"37d1e037-c169-4932-9928-f3d23ff47c07","Type":"ContainerStarted","Data":"c573931a211d01b148c78ade59f393c00ec55ea160bfcd9a82c8214167e55ae2"}
Mar 20 13:34:30 crc kubenswrapper[4755]: I0320 13:34:30.455699 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-kpm42" podStartSLOduration=178.455670554 podStartE2EDuration="2m58.455670554s" podCreationTimestamp="2026-03-20 13:31:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:30.44635397 +0000 UTC m=+250.044286499" watchObservedRunningTime="2026-03-20 13:34:30.455670554 +0000 UTC m=+250.053603083"
Mar 20 13:34:31 crc kubenswrapper[4755]: I0320 13:34:31.098426 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2rw7x"
Mar 20 13:34:34 crc kubenswrapper[4755]: I0320 13:34:34.032196 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-l4v7x"
Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.026178 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.030079 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:34:35 crc kubenswrapper[4755]: I0320 13:34:35.512864 4755 ???:1] "http: TLS handshake error from 192.168.126.11:49074: no serving certificate available for the kubelet"
Mar 20 13:34:36 crc kubenswrapper[4755]: I0320 13:34:36.751419 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:34:36 crc kubenswrapper[4755]: I0320 13:34:36.751560 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.093238 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165287 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") pod \"90e64094-373f-4a6b-ad6d-a68096ece17d\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165411 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "90e64094-373f-4a6b-ad6d-a68096ece17d" (UID: "90e64094-373f-4a6b-ad6d-a68096ece17d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.165568 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") pod \"90e64094-373f-4a6b-ad6d-a68096ece17d\" (UID: \"90e64094-373f-4a6b-ad6d-a68096ece17d\") " Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.168002 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/90e64094-373f-4a6b-ad6d-a68096ece17d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.181224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "90e64094-373f-4a6b-ad6d-a68096ece17d" (UID: "90e64094-373f-4a6b-ad6d-a68096ece17d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.271162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/90e64094-373f-4a6b-ad6d-a68096ece17d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"90e64094-373f-4a6b-ad6d-a68096ece17d","Type":"ContainerDied","Data":"484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768"} Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571255 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="484643bd71145ab3067eaafe2b0793f837385297014701a88b0c3c88e2238768" Mar 20 13:34:41 crc kubenswrapper[4755]: I0320 13:34:41.571325 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.042390 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.042665 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" containerID="cri-o://f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" gracePeriod=30 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.063326 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.063642 4755 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" containerID="cri-o://67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" gracePeriod=30 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.583336 4755 generic.go:334] "Generic (PLEG): container finished" podID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerID="67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" exitCode=0 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.583444 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerDied","Data":"67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327"} Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.585789 4755 generic.go:334] "Generic (PLEG): container finished" podID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerID="f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" exitCode=0 Mar 20 13:34:42 crc kubenswrapper[4755]: I0320 13:34:42.585840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerDied","Data":"f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f"} Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.686592 4755 patch_prober.go:28] interesting pod/route-controller-manager-589b99697b-vh78j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.687277 4755 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.747376 4755 patch_prober.go:28] interesting pod/controller-manager-5db74bc9fd-wp4mj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" start-of-body= Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.747933 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": dial tcp 10.217.0.54:8443: connect: connection refused" Mar 20 13:34:44 crc kubenswrapper[4755]: I0320 13:34:44.997433 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.685944 4755 patch_prober.go:28] interesting pod/route-controller-manager-589b99697b-vh78j container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.687129 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.746619 4755 patch_prober.go:28] interesting pod/controller-manager-5db74bc9fd-wp4mj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.746807 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:34:55 crc kubenswrapper[4755]: I0320 13:34:55.888575 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-22vns" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.123397 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage136669660/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.123639 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hgvrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lkkl5_openshift-marketplace(e9ec78bf-3afe-49d9-983a-99645840cecb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage136669660/2\": happened during read: context canceled" logger="UnhandledError" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.124983 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file 
\\\"/var/tmp/container_images_storage136669660/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.599744 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600353 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600371 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600383 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600391 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: E0320 13:34:56.600406 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600411 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600521 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b90540c-9ef1-478a-a7a1-48817d0c63d0" containerName="collect-profiles" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600533 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e64094-373f-4a6b-ad6d-a68096ece17d" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600543 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b86b19d6-a389-4b02-b514-f828f685b7fc" containerName="pruner" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.600997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.605278 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.605736 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.610231 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.638415 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.638482 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.739678 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") 
" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.739767 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.740628 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.763272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:56 crc kubenswrapper[4755]: I0320 13:34:56.947487 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.047419 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.090227 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.090834 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:34:57 crc kubenswrapper[4755]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:34:57 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xqgfr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
auto-csr-approver-29566892-xh9lg_openshift-infra(28deea0d-d80e-422b-a0c2-40670570aa68): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:34:57 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.092390 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.102615 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.109369 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.110406 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.110512 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:34:57 crc kubenswrapper[4755]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:34:57 crc kubenswrapper[4755]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7bc8c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566894-tzlc5_openshift-infra(d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:34:57 crc kubenswrapper[4755]: > logger="UnhandledError" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.111776 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141061 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.141374 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141389 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.141407 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141416 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" containerName="route-controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.141540 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" containerName="controller-manager" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" 
(UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146355 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146413 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146430 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146511 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146530 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") pod \"798ec963-27eb-429b-8cbd-310fbf41feb2\" (UID: \"798ec963-27eb-429b-8cbd-310fbf41feb2\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.146556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") pod \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\" (UID: \"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec\") " Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151284 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151435 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.151839 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.152867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca" (OuterVolumeSpecName: "client-ca") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.153015 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config" (OuterVolumeSpecName: "config") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.155011 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca" (OuterVolumeSpecName: "client-ca") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.155116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config" (OuterVolumeSpecName: "config") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.166583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.166739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq" (OuterVolumeSpecName: "kube-api-access-jmhjq") pod "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" (UID: "4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec"). InnerVolumeSpecName "kube-api-access-jmhjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.167191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.168674 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m" (OuterVolumeSpecName: "kube-api-access-hj85m") pod "798ec963-27eb-429b-8cbd-310fbf41feb2" (UID: "798ec963-27eb-429b-8cbd-310fbf41feb2"). InnerVolumeSpecName "kube-api-access-hj85m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248791 4755 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-hj85m\" (UniqueName: \"kubernetes.io/projected/798ec963-27eb-429b-8cbd-310fbf41feb2-kube-api-access-hj85m\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248806 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmhjq\" (UniqueName: \"kubernetes.io/projected/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-kube-api-access-jmhjq\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248821 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248833 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248891 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248902 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/798ec963-27eb-429b-8cbd-310fbf41feb2-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248911 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.248924 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/798ec963-27eb-429b-8cbd-310fbf41feb2-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 
crc kubenswrapper[4755]: I0320 13:34:57.248938 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.350249 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351095 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351223 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.351269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" 
Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.352521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.352818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.355060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.369276 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod \"route-controller-manager-7788c68c6d-l8hxg\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.507877 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.681726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" event={"ID":"4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec","Type":"ContainerDied","Data":"1f5e663be2a59aec380b38807a322f3980150870b6f5114ac3d543094b13a3ea"} Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.682102 4755 scope.go:117] "RemoveContainer" containerID="f2303460ee52f57d9c0d4ca6937be9655e873cce679cbb9ad4b52adff5ab1d9f" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.681786 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.684145 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.684289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j" event={"ID":"798ec963-27eb-429b-8cbd-310fbf41feb2","Type":"ContainerDied","Data":"29646e23b540b2bd9915ea60c10afb4061bcb8c76c4c926f49204988de496c28"} Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.686042 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" Mar 20 13:34:57 crc kubenswrapper[4755]: E0320 13:34:57.686373 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.709172 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.713090 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-589b99697b-vh78j"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.752740 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:57 crc kubenswrapper[4755]: I0320 13:34:57.754055 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5db74bc9fd-wp4mj"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.239877 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec" path="/var/lib/kubelet/pods/4c5e59d0-9eb8-4b6a-9bc5-8af8c4d0deec/volumes" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.240440 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="798ec963-27eb-429b-8cbd-310fbf41feb2" path="/var/lib/kubelet/pods/798ec963-27eb-429b-8cbd-310fbf41feb2/volumes" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.386527 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.388541 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.391959 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.392560 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393030 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393749 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.393797 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.395533 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.402494 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.404004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.482944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " 
pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483084 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483132 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.483167 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584194 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584315 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.584390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586149 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586532 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.586544 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.590339 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc kubenswrapper[4755]: I0320 13:34:59.601113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"controller-manager-d494d75f7-ckcts\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:34:59 crc 
kubenswrapper[4755]: I0320 13:34:59.709152 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.853493 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.854280 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qsjvb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},T
erminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-929x7_openshift-marketplace(2d2017d2-f4ee-4056-b350-cc313f3faeaf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:01 crc kubenswrapper[4755]: E0320 13:35:01.856016 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-929x7" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.008620 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.098336 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.388757 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.389988 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.410334 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.426885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.426984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.427171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528087 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528224 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528363 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.528480 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.550827 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"installer-9-crc\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:02 crc kubenswrapper[4755]: I0320 13:35:02.724629 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.636594 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-929x7" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.712566 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.712793 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g2rrc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-lkvql_openshift-marketplace(887fa242-bd5e-40f5-8f6e-a81c6e976322): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:05 crc kubenswrapper[4755]: E0320 13:35:05.714056 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" Mar 20 13:35:06 crc 
kubenswrapper[4755]: I0320 13:35:06.751687 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:06 crc kubenswrapper[4755]: I0320 13:35:06.751851 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.238079 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.303135 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.303526 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hchb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-shzbw_openshift-marketplace(2db67acd-25db-47a7-80ea-da4065a60e23): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:07 crc kubenswrapper[4755]: E0320 13:35:07.306465 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.687482 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" Mar 20 13:35:08 crc kubenswrapper[4755]: I0320 13:35:08.715372 4755 scope.go:117] "RemoveContainer" containerID="67f1337fb55ae11ac5981e92fa9dae7301bfa3d48870179967661b7a6b8a5327" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.788584 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.788826 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pfxf8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-vm24m_openshift-marketplace(184aa529-45c4-42c9-8eee-04bd18fba718): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.790267 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.809885 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.810285 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nblfx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-nlslg_openshift-marketplace(ce4d5763-1786-4b87-8497-0c65da46f446): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.812869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.829345 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.829558 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8rpkd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-d8rq7_openshift-marketplace(a751ac46-3f89-4d5a-8a23-0bbb3584dfa0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.831227 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" Mar 20 13:35:08 crc 
kubenswrapper[4755]: E0320 13:35:08.900964 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.901174 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w5qg9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-cgznb_openshift-marketplace(e8e34571-6648-4e5e-b3e9-05f87454e19a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:35:08 crc kubenswrapper[4755]: E0320 13:35:08.902947 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.161195 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.165739 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.170396 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec9201c8_bf96_460a_88c0_d37ed74be3b8.slice/crio-48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593 WatchSource:0}: Error finding container 48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593: Status 404 returned error can't find the container with id 48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593 Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.171321 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb9af4f4b_5318_4e19_948a_c976effb4bde.slice/crio-7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22 WatchSource:0}: Error finding container 7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22: Status 404 returned error can't find the container with id 
7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22 Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.239089 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc0668fdb_be01_431d_9cbb_dabae6eb44e1.slice/crio-ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e WatchSource:0}: Error finding container ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e: Status 404 returned error can't find the container with id ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e Mar 20 13:35:09 crc kubenswrapper[4755]: W0320 13:35:09.239514 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1634f58c_17b2_4fbe_b668_c0b386e97ee8.slice/crio-7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53 WatchSource:0}: Error finding container 7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53: Status 404 returned error can't find the container with id 7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.243956 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.244004 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.763592 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerStarted","Data":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.764166 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.764183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerStarted","Data":"7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.763771 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" containerID="cri-o://ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" gracePeriod=30 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerStarted","Data":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771682 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerStarted","Data":"48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.771826 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" containerID="cri-o://efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" gracePeriod=30 Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.772222 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.775327 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerStarted","Data":"fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.775375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerStarted","Data":"ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.781292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerStarted","Data":"51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee"} Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.781346 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerStarted","Data":"7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22"} Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.783924 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.784571 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.784711 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.785212 4755 patch_prober.go:28] interesting pod/controller-manager-d494d75f7-ckcts container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": EOF" start-of-body= Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.785293 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": EOF" Mar 20 13:35:09 crc kubenswrapper[4755]: E0320 13:35:09.788213 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.794550 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" podStartSLOduration=27.794523888 podStartE2EDuration="27.794523888s" podCreationTimestamp="2026-03-20 13:34:42 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.789426684 +0000 UTC m=+289.387359223" watchObservedRunningTime="2026-03-20 13:35:09.794523888 +0000 UTC m=+289.392456417" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.862234 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=7.862194761 podStartE2EDuration="7.862194761s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.86217891 +0000 UTC m=+289.460111449" watchObservedRunningTime="2026-03-20 13:35:09.862194761 +0000 UTC m=+289.460127290" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.897364 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.897332061 podStartE2EDuration="13.897332061s" podCreationTimestamp="2026-03-20 13:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.892928517 +0000 UTC m=+289.490861046" watchObservedRunningTime="2026-03-20 13:35:09.897332061 +0000 UTC m=+289.495264590" Mar 20 13:35:09 crc kubenswrapper[4755]: I0320 13:35:09.920755 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podStartSLOduration=27.920646471 podStartE2EDuration="27.920646471s" podCreationTimestamp="2026-03-20 13:34:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:09.914579926 +0000 UTC m=+289.512512455" watchObservedRunningTime="2026-03-20 13:35:09.920646471 +0000 
UTC m=+289.518579010" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.167258 4755 patch_prober.go:28] interesting pod/route-controller-manager-7788c68c6d-l8hxg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:50880->10.217.0.60:8443: read: connection reset by peer" start-of-body= Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.167331 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:50880->10.217.0.60:8443: read: connection reset by peer" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.295436 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.329449 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.329844 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.329862 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.330002 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerName="controller-manager" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.330555 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.340952 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370388 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370487 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370513 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370583 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") pod \"1634f58c-17b2-4fbe-b668-c0b386e97ee8\" (UID: 
\"1634f58c-17b2-4fbe-b668-c0b386e97ee8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370820 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370873 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370907 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370929 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.370969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372078 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372086 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca" (OuterVolumeSpecName: "client-ca") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.372596 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config" (OuterVolumeSpecName: "config") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.378090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.378716 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6" (OuterVolumeSpecName: "kube-api-access-5h5h6") pod "1634f58c-17b2-4fbe-b668-c0b386e97ee8" (UID: "1634f58c-17b2-4fbe-b668-c0b386e97ee8"). InnerVolumeSpecName "kube-api-access-5h5h6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.446772 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7788c68c6d-l8hxg_ec9201c8-bf96-460a-88c0-d37ed74be3b8/route-controller-manager/0.log" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.446870 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472216 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472309 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") pod 
\"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") pod \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\" (UID: \"ec9201c8-bf96-460a-88c0-d37ed74be3b8\") " Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472764 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472808 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472834 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" 
(UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472898 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.472989 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1634f58c-17b2-4fbe-b668-c0b386e97ee8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473004 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473018 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5h5h6\" (UniqueName: \"kubernetes.io/projected/1634f58c-17b2-4fbe-b668-c0b386e97ee8-kube-api-access-5h5h6\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473032 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.473044 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1634f58c-17b2-4fbe-b668-c0b386e97ee8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.474274 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config" (OuterVolumeSpecName: "config") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.474347 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca" (OuterVolumeSpecName: "client-ca") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.475613 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.477056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.477850 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx" (OuterVolumeSpecName: "kube-api-access-lrdlx") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "kube-api-access-lrdlx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.479005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.481474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.482515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ec9201c8-bf96-460a-88c0-d37ed74be3b8" (UID: "ec9201c8-bf96-460a-88c0-d37ed74be3b8"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.497244 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"controller-manager-66bd658874-r7m4x\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") " pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574314 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrdlx\" (UniqueName: \"kubernetes.io/projected/ec9201c8-bf96-460a-88c0-d37ed74be3b8-kube-api-access-lrdlx\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574355 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec9201c8-bf96-460a-88c0-d37ed74be3b8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574366 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.574376 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec9201c8-bf96-460a-88c0-d37ed74be3b8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.653916 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.719359 4755 csr.go:261] certificate signing request csr-8z59g is approved, waiting to be issued Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.739838 4755 csr.go:257] certificate signing request csr-8z59g is issued Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.789397 4755 generic.go:334] "Generic (PLEG): container finished" podID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" exitCode=0 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.789530 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790163 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerDied","Data":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-d494d75f7-ckcts" event={"ID":"1634f58c-17b2-4fbe-b668-c0b386e97ee8","Type":"ContainerDied","Data":"7531e0cacc644282a30c93147826a7c366db435900c5dc02bd7b47ef861b6f53"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.790315 4755 scope.go:117] "RemoveContainer" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794589 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7788c68c6d-l8hxg_ec9201c8-bf96-460a-88c0-d37ed74be3b8/route-controller-manager/0.log" Mar 20 13:35:10 crc 
kubenswrapper[4755]: I0320 13:35:10.794639 4755 generic.go:334] "Generic (PLEG): container finished" podID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" exitCode=255 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerDied","Data":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" event={"ID":"ec9201c8-bf96-460a-88c0-d37ed74be3b8","Type":"ContainerDied","Data":"48e26fa5ee4496a001b301bedcbff7d89734527ef652e962892d6d60dfd43593"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.794902 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.806309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerStarted","Data":"6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.823238 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" podStartSLOduration=18.20812138 podStartE2EDuration="1m10.82321528s" podCreationTimestamp="2026-03-20 13:34:00 +0000 UTC" firstStartedPulling="2026-03-20 13:34:17.36785724 +0000 UTC m=+236.965789769" lastFinishedPulling="2026-03-20 13:35:09.98295115 +0000 UTC m=+289.580883669" observedRunningTime="2026-03-20 13:35:10.821376243 +0000 UTC m=+290.419308772" watchObservedRunningTime="2026-03-20 13:35:10.82321528 +0000 UTC m=+290.421147809" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.825869 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerID="51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee" exitCode=0 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.825926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerDied","Data":"51c67d036c7dc57d3a165d1f76b532bbfeb40e22c709da46b4d599bbc59169ee"} Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.863578 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.869825 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-7788c68c6d-l8hxg"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.871828 4755 scope.go:117] "RemoveContainer" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.872383 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": container with ID starting with ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9 not found: ID does not exist" containerID="ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.872435 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9"} err="failed to get container status \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": rpc error: code = NotFound desc = could not find container \"ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9\": container with ID starting with ebde4485e45fc858aae643880f9b22f19127d63619cc4df384cae4fabb7709c9 not found: ID does not exist" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.872473 4755 scope.go:117] "RemoveContainer" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.873090 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.875756 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-d494d75f7-ckcts"] Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.878180 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"] Mar 20 13:35:10 crc kubenswrapper[4755]: W0320 13:35:10.885771 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd48c0ce9_15b4_48fc_b6f8_bbd69d45e6bc.slice/crio-9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3 WatchSource:0}: Error finding container 9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3: Status 404 returned error can't find the container with id 9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3 Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.891442 4755 scope.go:117] "RemoveContainer" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: E0320 13:35:10.892394 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": container with ID starting with efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8 not found: ID does not exist" containerID="efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8" Mar 20 13:35:10 crc kubenswrapper[4755]: I0320 13:35:10.892427 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8"} err="failed to get container status \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": rpc error: code = NotFound desc = could not find container \"efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8\": container with ID starting with efcf316df3907d4c7ea0980e74c5504d35b5d78ceb0ed47d3273e66d774e2cf8 not found: ID does not exist" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.234907 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1634f58c-17b2-4fbe-b668-c0b386e97ee8" path="/var/lib/kubelet/pods/1634f58c-17b2-4fbe-b668-c0b386e97ee8/volumes" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.235727 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" path="/var/lib/kubelet/pods/ec9201c8-bf96-460a-88c0-d37ed74be3b8/volumes" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.745362 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-01 13:07:04.189884043 +0000 UTC Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.745426 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6887h31m52.444460957s for next certificate rotation Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.836276 4755 generic.go:334] "Generic (PLEG): container finished" podID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerID="6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9" exitCode=0 Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.836356 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerDied","Data":"6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.840038 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerStarted","Data":"d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.840077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" 
event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerStarted","Data":"9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3"} Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.841198 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.847403 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" Mar 20 13:35:11 crc kubenswrapper[4755]: I0320 13:35:11.871569 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" podStartSLOduration=9.871543989 podStartE2EDuration="9.871543989s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:11.868234178 +0000 UTC m=+291.466166707" watchObservedRunningTime="2026-03-20 13:35:11.871543989 +0000 UTC m=+291.469476518" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.092690 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.096473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") pod \"b9af4f4b-5318-4e19-948a-c976effb4bde\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.096613 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b9af4f4b-5318-4e19-948a-c976effb4bde" (UID: "b9af4f4b-5318-4e19-948a-c976effb4bde"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.097295 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") pod \"b9af4f4b-5318-4e19-948a-c976effb4bde\" (UID: \"b9af4f4b-5318-4e19-948a-c976effb4bde\") " Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.098591 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b9af4f4b-5318-4e19-948a-c976effb4bde-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.108445 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b9af4f4b-5318-4e19-948a-c976effb4bde" (UID: "b9af4f4b-5318-4e19-948a-c976effb4bde"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.200504 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b9af4f4b-5318-4e19-948a-c976effb4bde-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393064 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:12 crc kubenswrapper[4755]: E0320 13:35:12.393317 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393331 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: E0320 13:35:12.393351 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393358 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393463 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9af4f4b-5318-4e19-948a-c976effb4bde" containerName="pruner" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393478 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec9201c8-bf96-460a-88c0-d37ed74be3b8" containerName="route-controller-manager" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.393908 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397005 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397502 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397522 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397904 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.397933 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404717 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod 
\"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.404918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.409443 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"]
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506779 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.506864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.508018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.508032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.511069 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.523511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"route-controller-manager-ff8dbcb57-cxhxw\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") " pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.717737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.746268 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-03 21:31:18.776131332 +0000 UTC
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.746315 4755 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6943h56m6.029820612s for next certificate rotation
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b9af4f4b-5318-4e19-948a-c976effb4bde","Type":"ContainerDied","Data":"7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22"}
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875792 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e71f78a0036760969cb2bf2112d30050726ab03a919f5fd0dd1c64266627c22"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.875499 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 13:35:12 crc kubenswrapper[4755]: I0320 13:35:12.982553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"]
Mar 20 13:35:13 crc kubenswrapper[4755]: W0320 13:35:13.001531 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dedcffe_6047_46bb_9970_eed6e7dfcd2a.slice/crio-f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa WatchSource:0}: Error finding container f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa: Status 404 returned error can't find the container with id f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.315288 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.445607 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") pod \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\" (UID: \"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34\") "
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.452381 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c" (OuterVolumeSpecName: "kube-api-access-7bc8c") pod "d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" (UID: "d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34"). InnerVolumeSpecName "kube-api-access-7bc8c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.546998 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bc8c\" (UniqueName: \"kubernetes.io/projected/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34-kube-api-access-7bc8c\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.883840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerStarted","Data":"c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a"}
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.883913 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerStarted","Data":"f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa"}
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.884122 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.887357 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerStarted","Data":"5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a"}
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.890027 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a"}
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.891442 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892355 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-tzlc5"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-tzlc5" event={"ID":"d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34","Type":"ContainerDied","Data":"87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de"}
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.892394 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87e583e770b84390d24444ba39d071b7a79cf80b1ef8556c747221568f1b50de"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.922355 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podStartSLOduration=11.922330509 podStartE2EDuration="11.922330509s" podCreationTimestamp="2026-03-20 13:35:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:13.90529192 +0000 UTC m=+293.503224449" watchObservedRunningTime="2026-03-20 13:35:13.922330509 +0000 UTC m=+293.520263038"
Mar 20 13:35:13 crc kubenswrapper[4755]: I0320 13:35:13.924300 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" podStartSLOduration=138.055234373 podStartE2EDuration="3m13.92429387s" podCreationTimestamp="2026-03-20 13:32:00 +0000 UTC" firstStartedPulling="2026-03-20 13:34:17.441558616 +0000 UTC m=+237.039491145" lastFinishedPulling="2026-03-20 13:35:13.310618113 +0000 UTC m=+292.908550642" observedRunningTime="2026-03-20 13:35:13.922490044 +0000 UTC m=+293.520422563" watchObservedRunningTime="2026-03-20 13:35:13.92429387 +0000 UTC m=+293.522226389"
Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.900921 4755 generic.go:334] "Generic (PLEG): container finished" podID="28deea0d-d80e-422b-a0c2-40670570aa68" containerID="5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a" exitCode=0
Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.901024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerDied","Data":"5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a"}
Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.905127 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a" exitCode=0
Mar 20 13:35:14 crc kubenswrapper[4755]: I0320 13:35:14.905445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a"}
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.193887 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg"
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.294083 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") pod \"28deea0d-d80e-422b-a0c2-40670570aa68\" (UID: \"28deea0d-d80e-422b-a0c2-40670570aa68\") "
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.301561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr" (OuterVolumeSpecName: "kube-api-access-xqgfr") pod "28deea0d-d80e-422b-a0c2-40670570aa68" (UID: "28deea0d-d80e-422b-a0c2-40670570aa68"). InnerVolumeSpecName "kube-api-access-xqgfr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.395937 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqgfr\" (UniqueName: \"kubernetes.io/projected/28deea0d-d80e-422b-a0c2-40670570aa68-kube-api-access-xqgfr\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.918737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xh9lg" event={"ID":"28deea0d-d80e-422b-a0c2-40670570aa68","Type":"ContainerDied","Data":"d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4"}
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.919621 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1a591dd18b3c1bd59ffa816236e003ecd9f7f13017f4edb2ba58c108b15d7f4"
Mar 20 13:35:16 crc kubenswrapper[4755]: I0320 13:35:16.918820 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xh9lg"
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.031028 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"]
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.032827 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager" containerID="cri-o://d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45" gracePeriod=30
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.058276 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"]
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.058549 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" containerID="cri-o://c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" gracePeriod=30
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.719541 4755 patch_prober.go:28] interesting pod/route-controller-manager-ff8dbcb57-cxhxw container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused" start-of-body=
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.719643 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.64:8443/healthz\": dial tcp 10.217.0.64:8443: connect: connection refused"
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.968442 4755 generic.go:334] "Generic (PLEG): container finished" podID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerID="c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" exitCode=0
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.968611 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerDied","Data":"c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a"}
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.970753 4755 generic.go:334] "Generic (PLEG): container finished" podID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerID="d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45" exitCode=0
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.970810 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerDied","Data":"d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45"}
Mar 20 13:35:22 crc kubenswrapper[4755]: I0320 13:35:22.982709 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerStarted","Data":"c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09"}
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.009861 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lkkl5" podStartSLOduration=4.596072378 podStartE2EDuration="59.009834337s" podCreationTimestamp="2026-03-20 13:34:24 +0000 UTC" firstStartedPulling="2026-03-20 13:34:27.304070845 +0000 UTC m=+246.902003374" lastFinishedPulling="2026-03-20 13:35:21.717832804 +0000 UTC m=+301.315765333" observedRunningTime="2026-03-20 13:35:23.006929138 +0000 UTC m=+302.604861697" watchObservedRunningTime="2026-03-20 13:35:23.009834337 +0000 UTC m=+302.607766866"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.908649 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") "
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919956 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") "
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.919999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") "
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.920111 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") "
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.920141 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") pod \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\" (UID: \"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc\") "
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.921962 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.922955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.923525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config" (OuterVolumeSpecName: "config") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.937401 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv" (OuterVolumeSpecName: "kube-api-access-7fsjv") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "kube-api-access-7fsjv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.952873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" (UID: "d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962523 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"]
Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.962839 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962856 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager"
Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.962869 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.962877 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: E0320 13:35:23.963644 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963681 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963807 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963827 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" containerName="controller-manager"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.963841 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" containerName="oc"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.964263 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"]
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.964369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x" event={"ID":"d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc","Type":"ContainerDied","Data":"9d886dad375acd64a80cd69d4d439136d067d9ca5168b181af380604ec9b81f3"}
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999307 4755 scope.go:117] "RemoveContainer" containerID="d22e5235e0b7fcba5f514b736107cceaeba1d75cc18f0f32d6b86a3f8d648a45"
Mar 20 13:35:23 crc kubenswrapper[4755]: I0320 13:35:23.999529 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66bd658874-r7m4x"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.019329 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021560 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021623 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021687 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021701 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021713 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fsjv\" (UniqueName: \"kubernetes.io/projected/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-kube-api-access-7fsjv\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021724 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.021733 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.072258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"]
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.077615 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66bd658874-r7m4x"]
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122297 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") "
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") "
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.122774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") "
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") pod \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\" (UID: \"2dedcffe-6047-46bb-9970-eed6e7dfcd2a\") "
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca" (OuterVolumeSpecName: "client-ca") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124420 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config" (OuterVolumeSpecName: "config") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124644 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124754 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.124871 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.125013 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.125039 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.126347 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.126612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.127585 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.130379 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc" (OuterVolumeSpecName: "kube-api-access-2vckc") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "kube-api-access-2vckc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.131834 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2dedcffe-6047-46bb-9970-eed6e7dfcd2a" (UID: "2dedcffe-6047-46bb-9970-eed6e7dfcd2a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.133117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.147581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"controller-manager-55c45b749f-kr7m7\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") " pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.225900 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.225944 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vckc\" (UniqueName: \"kubernetes.io/projected/2dedcffe-6047-46bb-9970-eed6e7dfcd2a-kube-api-access-2vckc\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.315296 4755 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:24 crc kubenswrapper[4755]: I0320 13:35:24.749941 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.010483 4755 generic.go:334] "Generic (PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.010608 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.015352 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.015379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.018001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerStarted","Data":"dc837d19aecb0eeba01a794632cec0b82547d105d101892ba220de3315f6f008"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021534 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" 
event={"ID":"2dedcffe-6047-46bb-9970-eed6e7dfcd2a","Type":"ContainerDied","Data":"f45944160414689b0d3f59e85db10e4259a31556447c446575a4dcfa742ff6aa"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021583 4755 scope.go:117] "RemoveContainer" containerID="c83605f3f28595d6e18891309036bda3562e2992511ee20801ce02e207428a8a" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.021740 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.031271 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" exitCode=0 Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.031341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de"} Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.124672 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.127532 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-ff8dbcb57-cxhxw"] Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.234387 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" path="/var/lib/kubelet/pods/2dedcffe-6047-46bb-9970-eed6e7dfcd2a/volumes" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.235450 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc" 
path="/var/lib/kubelet/pods/d48c0ce9-15b4-48fc-b6f8-bbd69d45e6bc/volumes" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.331292 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:25 crc kubenswrapper[4755]: I0320 13:35:25.331539 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.042673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerStarted","Data":"dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c"} Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.045005 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.049582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.068010 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" podStartSLOduration=4.067987228 podStartE2EDuration="4.067987228s" podCreationTimestamp="2026-03-20 13:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:26.063631326 +0000 UTC m=+305.661563895" watchObservedRunningTime="2026-03-20 13:35:26.067987228 +0000 UTC m=+305.665919797" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411020 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 
13:35:26 crc kubenswrapper[4755]: E0320 13:35:26.411319 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411337 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.411471 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dedcffe-6047-46bb-9970-eed6e7dfcd2a" containerName="route-controller-manager" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.412001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416279 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416407 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416443 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.416536 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.417449 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.418084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:35:26 
crc kubenswrapper[4755]: I0320 13:35:26.433894 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.463252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 
20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.563825 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.565061 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod 
\"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.565309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.571717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.581026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"route-controller-manager-f8895797d-ss26f\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") " pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.655043 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" probeResult="failure" output=< Mar 20 13:35:26 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:35:26 crc kubenswrapper[4755]: > Mar 20 13:35:26 crc kubenswrapper[4755]: I0320 13:35:26.781026 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:27 crc kubenswrapper[4755]: I0320 13:35:27.051780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.060407 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" exitCode=0 Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.060545 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} Mar 20 13:35:28 crc kubenswrapper[4755]: I0320 13:35:28.858381 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"] Mar 20 13:35:29 crc kubenswrapper[4755]: I0320 13:35:29.070726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerStarted","Data":"3132c3da4cf259af8b45987aa5d639d9329cf6c2181feff72111c4f02f43d042"} Mar 20 13:35:33 crc kubenswrapper[4755]: I0320 13:35:33.108040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerStarted","Data":"b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9"} Mar 20 13:35:33 crc kubenswrapper[4755]: I0320 13:35:33.112835 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerStarted","Data":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.119307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.125518 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.145736 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" podStartSLOduration=12.145711821 podStartE2EDuration="12.145711821s" podCreationTimestamp="2026-03-20 13:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:34.145238987 +0000 UTC m=+313.743171546" watchObservedRunningTime="2026-03-20 13:35:34.145711821 +0000 UTC m=+313.743644390" Mar 20 13:35:34 crc kubenswrapper[4755]: I0320 13:35:34.177139 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-shzbw" podStartSLOduration=8.446711804 podStartE2EDuration="1m13.177103517s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.719792453 +0000 UTC m=+243.317724982" lastFinishedPulling="2026-03-20 13:35:28.450184156 +0000 UTC m=+308.048116695" observedRunningTime="2026-03-20 13:35:34.17127894 +0000 UTC m=+313.769211489" watchObservedRunningTime="2026-03-20 13:35:34.177103517 +0000 UTC m=+313.775036056" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.441162 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.514543 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:35 crc kubenswrapper[4755]: I0320 13:35:35.690388 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"] Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.136252 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerStarted","Data":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.138099 4755 generic.go:334] "Generic (PLEG): container finished" podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" exitCode=0 Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.138955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19"} Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751805 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751904 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.751966 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.752626 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:35:36 crc kubenswrapper[4755]: I0320 13:35:36.752715 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" gracePeriod=600 Mar 20 13:35:37 crc kubenswrapper[4755]: I0320 13:35:37.153876 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lkkl5" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server" containerID="cri-o://c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09" gracePeriod=2 Mar 20 13:35:37 crc kubenswrapper[4755]: I0320 13:35:37.215027 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-929x7" podStartSLOduration=8.438647583 podStartE2EDuration="1m14.215005203s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="2026-03-20 13:34:24.916687919 +0000 UTC m=+244.514620448" lastFinishedPulling="2026-03-20 13:35:30.693045489 +0000 UTC m=+310.290978068" 
observedRunningTime="2026-03-20 13:35:37.211254488 +0000 UTC m=+316.809187027" watchObservedRunningTime="2026-03-20 13:35:37.215005203 +0000 UTC m=+316.812937742" Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.162819 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" exitCode=0 Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.162934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280"} Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.166261 4755 generic.go:334] "Generic (PLEG): container finished" podID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerID="c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09" exitCode=0 Mar 20 13:35:38 crc kubenswrapper[4755]: I0320 13:35:38.166290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09"} Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.176127 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerStarted","Data":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.179562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"} Mar 20 13:35:39 crc 
kubenswrapper[4755]: I0320 13:35:39.626013 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5" Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.786899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.787133 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.787208 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") pod \"e9ec78bf-3afe-49d9-983a-99645840cecb\" (UID: \"e9ec78bf-3afe-49d9-983a-99645840cecb\") " Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.788064 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities" (OuterVolumeSpecName: "utilities") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.794766 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm" (OuterVolumeSpecName: "kube-api-access-hgvrm") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "kube-api-access-hgvrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.888710 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgvrm\" (UniqueName: \"kubernetes.io/projected/e9ec78bf-3afe-49d9-983a-99645840cecb-kube-api-access-hgvrm\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.888748 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.937232 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9ec78bf-3afe-49d9-983a-99645840cecb" (UID: "e9ec78bf-3afe-49d9-983a-99645840cecb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:35:39 crc kubenswrapper[4755]: I0320 13:35:39.990081 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9ec78bf-3afe-49d9-983a-99645840cecb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkkl5"
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkkl5" event={"ID":"e9ec78bf-3afe-49d9-983a-99645840cecb","Type":"ContainerDied","Data":"e8b20b8055283611079980efbe691798926e8e8d967c03c6c2d40a174aa03339"}
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.197911 4755 scope.go:117] "RemoveContainer" containerID="c537535ed1ecbb2ac03077b2ffdf6626d2f4f9df45c7d2b0f60c4ba044891b09"
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.205584 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" exitCode=0
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.205702 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"}
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.232681 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vm24m" podStartSLOduration=5.21447221 podStartE2EDuration="1m19.23262184s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.779001758 +0000 UTC m=+243.376934287" lastFinishedPulling="2026-03-20 13:35:37.797151338 +0000 UTC m=+317.395083917" observedRunningTime="2026-03-20 13:35:40.227920286 +0000 UTC m=+319.825852855" watchObservedRunningTime="2026-03-20 13:35:40.23262184 +0000 UTC m=+319.830554399"
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.274623 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"]
Mar 20 13:35:40 crc kubenswrapper[4755]: I0320 13:35:40.279200 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lkkl5"]
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.233000 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" path="/var/lib/kubelet/pods/e9ec78bf-3afe-49d9-983a-99645840cecb/volumes"
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.303957 4755 scope.go:117] "RemoveContainer" containerID="ce57e7ac1f12e13395d7f1206b4c545ce2cbc6ea5e1fd28c43be80858e51117a"
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.504826 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.504902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.568595 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:35:41 crc kubenswrapper[4755]: I0320 13:35:41.672300 4755 scope.go:117] "RemoveContainer" containerID="3af5416fdb7aa4f016f347ab29ea0d465dc03414f176f3ce6611a45e7044555c"
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.040543 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"]
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.040869 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager" containerID="cri-o://dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c" gracePeriod=30
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.132138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"]
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.132380 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager" containerID="cri-o://b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9" gracePeriod=30
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.150563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.150756 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.206150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vm24m"
Mar 20 13:35:42 crc kubenswrapper[4755]: I0320 13:35:42.268970 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-shzbw"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.242914 4755 generic.go:334] "Generic (PLEG): container finished" podID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerID="b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9" exitCode=0
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.243049 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerDied","Data":"b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9"}
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.247909 4755 generic.go:334] "Generic (PLEG): container finished" podID="400323f5-babd-4943-a66e-515ee2b59889" containerID="dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c" exitCode=0
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.248137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerDied","Data":"dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c"}
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.488617 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-929x7"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.488718 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-929x7"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.551099 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-929x7"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.799910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.844075 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.857572 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"]
Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858103 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858124 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858155 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server"
Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858186 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858194 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858207 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-utilities"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858238 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-utilities"
Mar 20 13:35:43 crc kubenswrapper[4755]: E0320 13:35:43.858257 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-content"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858264 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="extract-content"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858405 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9ec78bf-3afe-49d9-983a-99645840cecb" containerName="registry-server"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858421 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" containerName="route-controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.858433 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="400323f5-babd-4943-a66e-515ee2b59889" containerName="controller-manager"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.863497 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"]
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.863610 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876638 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876894 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.876966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877110 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877144 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877275 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") pod \"af80897c-3c59-4376-b0f0-15d862d4b7d5\" (UID: \"af80897c-3c59-4376-b0f0-15d862d4b7d5\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877313 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877364 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") pod \"400323f5-babd-4943-a66e-515ee2b59889\" (UID: \"400323f5-babd-4943-a66e-515ee2b59889\") "
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.878196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca" (OuterVolumeSpecName: "client-ca") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.877775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.879188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.879816 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.880124 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.880795 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.881909 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config" (OuterVolumeSpecName: "config") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.886467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.886865 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config" (OuterVolumeSpecName: "config") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.895729 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.895860 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz" (OuterVolumeSpecName: "kube-api-access-8d5qz") pod "400323f5-babd-4943-a66e-515ee2b59889" (UID: "400323f5-babd-4943-a66e-515ee2b59889"). InnerVolumeSpecName "kube-api-access-8d5qz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.899843 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9" (OuterVolumeSpecName: "kube-api-access-d68f9") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "kube-api-access-d68f9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.901710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "af80897c-3c59-4376-b0f0-15d862d4b7d5" (UID: "af80897c-3c59-4376-b0f0-15d862d4b7d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.981849 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982797 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.982975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983087 4755 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983165 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/400323f5-babd-4943-a66e-515ee2b59889-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983239 4755 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983307 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af80897c-3c59-4376-b0f0-15d862d4b7d5-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983372 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af80897c-3c59-4376-b0f0-15d862d4b7d5-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983459 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d5qz\" (UniqueName: \"kubernetes.io/projected/400323f5-babd-4943-a66e-515ee2b59889-kube-api-access-8d5qz\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983534 4755 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/400323f5-babd-4943-a66e-515ee2b59889-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.983962 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d68f9\" (UniqueName: \"kubernetes.io/projected/af80897c-3c59-4376-b0f0-15d862d4b7d5-kube-api-access-d68f9\") on node \"crc\" DevicePath \"\""
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.987217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-client-ca\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.987914 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2857fcc-84c2-42ab-81d7-5e430db9cfba-config\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:43 crc kubenswrapper[4755]: I0320 13:35:43.994603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2857fcc-84c2-42ab-81d7-5e430db9cfba-serving-cert\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.001002 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqsl\" (UniqueName: \"kubernetes.io/projected/f2857fcc-84c2-42ab-81d7-5e430db9cfba-kube-api-access-mfqsl\") pod \"route-controller-manager-7f67b6877-57b8j\" (UID: \"f2857fcc-84c2-42ab-81d7-5e430db9cfba\") " pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.192481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.267376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerStarted","Data":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.279921 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.295483 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d8rq7" podStartSLOduration=3.429581004 podStartE2EDuration="1m23.295458996s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.799376768 +0000 UTC m=+243.397309297" lastFinishedPulling="2026-03-20 13:35:43.66525476 +0000 UTC m=+323.263187289" observedRunningTime="2026-03-20 13:35:44.293631734 +0000 UTC m=+323.891564293" watchObservedRunningTime="2026-03-20 13:35:44.295458996 +0000 UTC m=+323.893391525"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.302238 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerStarted","Data":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.338494 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" exitCode=0
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.338607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.341083 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lkvql" podStartSLOduration=6.3537257910000005 podStartE2EDuration="1m20.341059837s" podCreationTimestamp="2026-03-20 13:34:24 +0000 UTC" firstStartedPulling="2026-03-20 13:34:27.316750481 +0000 UTC m=+246.914683010" lastFinishedPulling="2026-03-20 13:35:41.304084487 +0000 UTC m=+320.902017056" observedRunningTime="2026-03-20 13:35:44.340255114 +0000 UTC m=+323.938187653" watchObservedRunningTime="2026-03-20 13:35:44.341059837 +0000 UTC m=+323.938992356"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.349540 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerStarted","Data":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351144 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f" event={"ID":"af80897c-3c59-4376-b0f0-15d862d4b7d5","Type":"ContainerDied","Data":"3132c3da4cf259af8b45987aa5d639d9329cf6c2181feff72111c4f02f43d042"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351179 4755 scope.go:117] "RemoveContainer" containerID="b7d323c9248f3d65f8f58a82fb7785e022f487239b0ddc588434886b81e035a9"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.351317 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.362546 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.363002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55c45b749f-kr7m7" event={"ID":"400323f5-babd-4943-a66e-515ee2b59889","Type":"ContainerDied","Data":"dc837d19aecb0eeba01a794632cec0b82547d105d101892ba220de3315f6f008"}
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.406292 4755 scope.go:117] "RemoveContainer" containerID="dfc2aade1d617a8ba67e6cbcd0ec853a353d6a4d85771372ecc77e2a5025a58c"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.417791 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"]
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.422335 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f8895797d-ss26f"]
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.425006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-929x7"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.430457 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"]
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.454024 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55c45b749f-kr7m7"]
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.687957 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"]
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.887750 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lkvql"
Mar 20 13:35:44 crc kubenswrapper[4755]: I0320 13:35:44.887842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lkvql"
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.232375 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="400323f5-babd-4943-a66e-515ee2b59889" path="/var/lib/kubelet/pods/400323f5-babd-4943-a66e-515ee2b59889/volumes"
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.233569 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af80897c-3c59-4376-b0f0-15d862d4b7d5" path="/var/lib/kubelet/pods/af80897c-3c59-4376-b0f0-15d862d4b7d5/volumes"
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.234303 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p"]
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.371880 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" event={"ID":"f2857fcc-84c2-42ab-81d7-5e430db9cfba","Type":"ContainerStarted","Data":"05c817fcb54f8b0544d8f4535a30e90ac957e51d5fddffedb04bb96117c07a28"}
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.371931 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" event={"ID":"f2857fcc-84c2-42ab-81d7-5e430db9cfba","Type":"ContainerStarted","Data":"3187e96982f3615bd579e5fcf8eeb7d0176b994fffb615b11e16a52d9cef80dd"}
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.397774 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cgznb" podStartSLOduration=4.28759628 podStartE2EDuration="1m24.397744067s" podCreationTimestamp="2026-03-20 13:34:21 +0000 UTC" firstStartedPulling="2026-03-20 13:34:23.682288531 +0000 UTC m=+243.280221060" lastFinishedPulling="2026-03-20 13:35:43.792436318 +0000 UTC m=+323.390368847" observedRunningTime="2026-03-20 13:35:45.395082713 +0000 UTC m=+324.993015262" watchObservedRunningTime="2026-03-20 13:35:45.397744067 +0000 UTC m=+324.995676596"
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.419668 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j" podStartSLOduration=3.419623627 podStartE2EDuration="3.419623627s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:35:45.417045756 +0000 UTC m=+325.014978305" watchObservedRunningTime="2026-03-20 13:35:45.419623627 +0000 UTC m=+325.017556166"
Mar 20 13:35:45 crc kubenswrapper[4755]: I0320 13:35:45.951133 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:35:45 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s
Mar 20 13:35:45 crc kubenswrapper[4755]: >
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.382967 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerStarted","Data":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"}
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.384877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.390061 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f67b6877-57b8j"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.405251 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nlslg" podStartSLOduration=2.848870367 podStartE2EDuration="1m23.405221457s" podCreationTimestamp="2026-03-20 13:34:23 +0000 UTC" firstStartedPulling="2026-03-20 13:34:24.867216712 +0000 UTC m=+244.465149241" lastFinishedPulling="2026-03-20 13:35:45.423567802 +0000 UTC m=+325.021500331" observedRunningTime="2026-03-20 13:35:46.403178462 +0000 UTC m=+326.001110991" watchObservedRunningTime="2026-03-20 13:35:46.405221457 +0000 UTC m=+326.003153996"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.424792 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"]
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.425599 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59d967769f-g8465"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428206 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428844 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.428866 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.429746 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.430486 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.430899 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.443083 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.449994 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"]
Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") "
pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529678 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.529749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.530811 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.530918 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.632934 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.633064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.633166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 
13:35:46.634356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-client-ca\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.634519 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-config\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.635212 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/c9017aa0-1c82-4753-b448-b07556e89259-proxy-ca-bundles\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.645217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c9017aa0-1c82-4753-b448-b07556e89259-serving-cert\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.657949 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-825bm\" (UniqueName: \"kubernetes.io/projected/c9017aa0-1c82-4753-b448-b07556e89259-kube-api-access-825bm\") pod \"controller-manager-59d967769f-g8465\" (UID: \"c9017aa0-1c82-4753-b448-b07556e89259\") " 
pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:46 crc kubenswrapper[4755]: I0320 13:35:46.742545 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.073055 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59d967769f-g8465"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.255712 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.256752 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.256876 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257541 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257696 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257742 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.257690 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.258129 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" gracePeriod=15 Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.258315 4755 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.292870 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.311804 4755 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312083 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312105 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312118 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312125 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312138 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312145 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312156 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312162 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312169 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312175 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312275 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312282 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312290 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312296 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312307 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312313 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312417 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312427 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312434 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312446 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312453 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312459 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312466 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312474 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312570 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312579 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312697 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.312824 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.312832 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342110 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342190 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342227 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342325 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.342420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: E0320 13:35:47.349699 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC 
m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.395041 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396193 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396812 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" exitCode=0 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396834 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" exitCode=2 Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.396902 4755 scope.go:117] "RemoveContainer" containerID="88572a24f10bf783741ac50cf12098d5df0313fab673db7b778b64ba9954761e" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.399553 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" event={"ID":"c9017aa0-1c82-4753-b448-b07556e89259","Type":"ContainerStarted","Data":"08f942cf97d3c7a747d20319a863fdb751e92d12dadc8b794d2edd09a8ddba05"} Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.399587 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" 
event={"ID":"c9017aa0-1c82-4753-b448-b07556e89259","Type":"ContainerStarted","Data":"e83f0d09983e19d308152d03cd48e7dcf807481871cc70e9fa4f974545dbacc3"} Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443253 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443382 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443384 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443434 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443493 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443534 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443572 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443673 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443873 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.444025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.443963 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: I0320 13:35:47.589477 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:35:47 crc kubenswrapper[4755]: W0320 13:35:47.609949 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050 WatchSource:0}: Error finding container 6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050: Status 404 returned error can't find the container with id 6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.411148 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.412311 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.412364 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.415456 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.415525 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6292c1d5d321cc62a1d6c5dfa738cf623b73fd2997de851cf0c324807352c050"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.417503 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.417984 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.420184 4755 generic.go:334] "Generic (PLEG): container finished" podID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerID="fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.420859 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerDied","Data":"fffe9bdab6834124d2b86719fd421bb0588e3380d4e9338e0e67109cffba702d"} Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.421409 4755 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.422068 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.423098 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.423621 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424227 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424621 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.424937 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.425204 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428157 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428454 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.428762 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial 
tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.429032 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.429307 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.747111 4755 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 13:35:48 crc kubenswrapper[4755]: I0320 13:35:48.747832 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: E0320 13:35:49.374591 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 
openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.751140 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.752438 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.752992 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.753385 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.883872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884011 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884081 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") pod \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\" (UID: \"c0668fdb-be01-431d-9cbb-dabae6eb44e1\") " Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884119 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock" (OuterVolumeSpecName: "var-lock") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884255 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884624 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.884641 4755 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.892922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c0668fdb-be01-431d-9cbb-dabae6eb44e1" (UID: "c0668fdb-be01-431d-9cbb-dabae6eb44e1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4755]: I0320 13:35:49.986250 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c0668fdb-be01-431d-9cbb-dabae6eb44e1-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c0668fdb-be01-431d-9cbb-dabae6eb44e1","Type":"ContainerDied","Data":"ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e"} Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438194 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac046ff39f3a04fb07b0ff93df73e25dc97a6778cfdb9268954f0e54fa7ee00e" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.438198 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.454431 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.454892 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.455234 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.863556 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.865184 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.865818 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.866281 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.866750 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.867300 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897484 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897576 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897662 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897715 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.897720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898055 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898079 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4755]: I0320 13:35:50.898098 4755 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.229605 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.230234 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.230725 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": 
dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.231364 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.236639 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.452637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.454850 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" exitCode=0 Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.454954 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.455001 4755 scope.go:117] "RemoveContainer" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.456128 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.457029 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.457804 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.459037 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.460322 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.460855 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.461097 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.461319 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.483039 4755 scope.go:117] "RemoveContainer" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.508150 4755 scope.go:117] "RemoveContainer" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.529025 4755 scope.go:117] "RemoveContainer" 
containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.554737 4755 scope.go:117] "RemoveContainer" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.614053 4755 scope.go:117] "RemoveContainer" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.638190 4755 scope.go:117] "RemoveContainer" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.638925 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": container with ID starting with 40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4 not found: ID does not exist" containerID="40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639020 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4"} err="failed to get container status \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": rpc error: code = NotFound desc = could not find container \"40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4\": container with ID starting with 40d1009769a2af5c1357216f65cd8fb1a83581ea3ab6ba907f3ee509b2e0b7d4 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639070 4755 scope.go:117] "RemoveContainer" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.639512 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": container with ID starting with ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c not found: ID does not exist" containerID="ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639557 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c"} err="failed to get container status \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": rpc error: code = NotFound desc = could not find container \"ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c\": container with ID starting with ef4f53def09cfb3c065652fa9342e0d1b318279a465eb12cc8fcb814f3006b2c not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.639591 4755 scope.go:117] "RemoveContainer" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.639975 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": container with ID starting with 91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462 not found: ID does not exist" containerID="91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640021 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462"} err="failed to get container status \"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": rpc error: code = NotFound desc = could not find container 
\"91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462\": container with ID starting with 91e67c3b1235e69a328986c1e7cebeb522892138cd10bd5538b04dee654e9462 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640053 4755 scope.go:117] "RemoveContainer" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.640632 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": container with ID starting with 280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78 not found: ID does not exist" containerID="280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640706 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78"} err="failed to get container status \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": rpc error: code = NotFound desc = could not find container \"280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78\": container with ID starting with 280031b15f471ce67554d6528a79d6df512b4690249875aba7f76e3a17dd8c78 not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.640731 4755 scope.go:117] "RemoveContainer" containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.641077 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": container with ID starting with 26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c not found: ID does not exist" 
containerID="26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641140 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c"} err="failed to get container status \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": rpc error: code = NotFound desc = could not find container \"26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c\": container with ID starting with 26d4a765fd0eeed10fe112e85eb7ab75e23582d26900534a81e618c38d6a587c not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641180 4755 scope.go:117] "RemoveContainer" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: E0320 13:35:51.641603 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": container with ID starting with 2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e not found: ID does not exist" containerID="2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.641627 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e"} err="failed to get container status \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": rpc error: code = NotFound desc = could not find container \"2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e\": container with ID starting with 2c696375843661b3c7082e7e41b7a75cabe3db7a3c31002e7e22049cdc37a31e not found: ID does not exist" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.798102 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.798168 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.869993 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.871125 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.871737 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.872345 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.873087 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.873687 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.937629 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:51 crc kubenswrapper[4755]: I0320 13:35:51.937725 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.051574 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.052549 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.053693 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.054366 4755 
status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.054996 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.055519 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.056038 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.219950 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.221350 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.222182 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.222755 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.223428 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.224289 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.224810 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.225400 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.538540 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.539531 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.540364 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.541169 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 
13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.541637 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542049 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.542550 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543093 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543586 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" 
pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.543994 4755 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.544626 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.545351 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.545867 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.546421 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:52 crc kubenswrapper[4755]: I0320 13:35:52.546953 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.715864 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.716425 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717179 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717561 4755 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.717909 4755 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.717953 4755 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.718268 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="200ms" Mar 20 13:35:53 crc kubenswrapper[4755]: E0320 13:35:53.919375 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="400ms" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.955246 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:53 crc kubenswrapper[4755]: I0320 13:35:53.955337 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.016689 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.017790 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" 
Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018286 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018578 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.018889 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019193 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019424 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 
38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.019717 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: E0320 13:35:54.337220 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="800ms" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.571539 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.572591 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.573192 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.573701 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.574175 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.574623 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.575190 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.575771 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.967008 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:54 crc 
kubenswrapper[4755]: I0320 13:35:54.968406 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.969573 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.970395 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.971008 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.971605 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: 
connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.972165 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.972776 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:54 crc kubenswrapper[4755]: I0320 13:35:54.973337 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.054343 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.055084 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.055718 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" 
pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.056345 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.056633 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057004 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057371 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057565 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: I0320 13:35:55.057772 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:55 crc kubenswrapper[4755]: E0320 13:35:55.139397 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="1.6s" Mar 20 13:35:56 crc kubenswrapper[4755]: E0320 13:35:56.741352 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="3.2s" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.254191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.254642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.255141 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.255349 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.255972 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.256082 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316\": dial tcp 
38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.356063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:35:57 crc kubenswrapper[4755]: I0320 13:35:57.356535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:35:57 crc kubenswrapper[4755]: W0320 13:35:57.357339 4755 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:57 crc kubenswrapper[4755]: E0320 13:35:57.357498 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.255596 4755 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync 
configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.255632 4755 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.256197 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:00.256155547 +0000 UTC m=+459.854088116 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.256261 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:00.256235629 +0000 UTC m=+459.854168188 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.356912 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.356952 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:58 crc kubenswrapper[4755]: W0320 13:35:58.357751 4755 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.357864 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:58 crc kubenswrapper[4755]: W0320 13:35:58.945827 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:35:58 crc kubenswrapper[4755]: E0320 13:35:58.945922 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.225075 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.225760 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226113 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226434 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.226838 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.227250 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.227772 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.228121 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.228591 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.247601 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.247649 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb" Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.248162 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.248698 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:35:59 crc kubenswrapper[4755]: W0320 13:35:59.272815 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df WatchSource:0}: Error finding container f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df: Status 404 returned error can't find the container with id f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358129 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358163 4755 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358190 4755 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358201 4755 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358303 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:38:01.358270475 +0000 UTC m=+460.956203014 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.358336 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:38:01.358323797 +0000 UTC m=+460.956256406 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.375756 4755 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.181:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-59d967769f-g8465.189e901aa55eec01 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-59d967769f-g8465,UID:c9017aa0-1c82-4753-b448-b07556e89259,APIVersion:v1,ResourceVersion:29964,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:35:47.348564993 +0000 
UTC m=+326.946497522,LastTimestamp:2026-03-20 13:35:47.348564993 +0000 UTC m=+326.946497522,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:35:59 crc kubenswrapper[4755]: I0320 13:35:59.521181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f71da798fd1b166d11e8b6f637838b6712ed8c4b78b6e21ef261425a1cead2df"} Mar 20 13:35:59 crc kubenswrapper[4755]: E0320 13:35:59.942828 4755 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.181:6443: connect: connection refused" interval="6.4s" Mar 20 13:36:00 crc kubenswrapper[4755]: W0320 13:36:00.096754 4755 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.096873 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27316\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:36:00 crc kubenswrapper[4755]: W0320 13:36:00.118384 4755 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312": dial tcp 38.102.83.181:6443: connect: connection refused Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.118574 4755 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27312\": dial tcp 38.102.83.181:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.532798 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535303 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535384 4755 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea" exitCode=1 Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.535468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea"} Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.536304 4755 scope.go:117] "RemoveContainer" containerID="285033a56362d0d7c3a110db3b17a09fe642e807df9e94f26537fcd3a9e666ea" Mar 
20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.536862 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.537554 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.538549 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539071 4755 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="65b16727d1e2d3ada1de65c7caf024709ec513abe901df42225ca09d63835f49" exitCode=0 Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"65b16727d1e2d3ada1de65c7caf024709ec513abe901df42225ca09d63835f49"} Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539229 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539420 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539447 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.539882 4755 status_manager.go:851] "Failed to get status for pod" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: E0320 13:36:00.540112 4755 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.540448 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.540839 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.541275 4755 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.541701 4755 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542208 4755 status_manager.go:851] "Failed to get status for pod" podUID="c9017aa0-1c82-4753-b448-b07556e89259" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-59d967769f-g8465\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542554 4755 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.542913 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.543429 4755 status_manager.go:851] "Failed to get status for pod" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" pod="openshift-marketplace/certified-operators-d8rq7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-d8rq7\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.544100 4755 status_manager.go:851] "Failed to get status for pod" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" pod="openshift-marketplace/redhat-operators-lkvql" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-lkvql\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.544614 4755 status_manager.go:851] "Failed to get status for pod" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.545252 4755 status_manager.go:851] "Failed to get status for pod" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" pod="openshift-marketplace/community-operators-cgznb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-cgznb\": dial tcp 38.102.83.181:6443: connect: connection refused" Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.545728 4755 status_manager.go:851] "Failed to get status for pod" 
podUID="184aa529-45c4-42c9-8eee-04bd18fba718" pod="openshift-marketplace/community-operators-vm24m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-vm24m\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:00 crc kubenswrapper[4755]: I0320 13:36:00.546193 4755 status_manager.go:851] "Failed to get status for pod" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" pod="openshift-marketplace/redhat-marketplace-nlslg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-nlslg\": dial tcp 38.102.83.181:6443: connect: connection refused"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.549932 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.552199 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.552303 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"11934a74662a01bf0f322f88ccd25f18ae746365df5aba2c83fb9bf72d79a6a6"}
Mar 20 13:36:01 crc kubenswrapper[4755]: I0320 13:36:01.556222 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d7d6ec40aa203d17a1aab627d7d3551eccdc43a15d2e7018b643b749da4273b1"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.225951 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.231173 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6901cc8a30e124ad290689add14419d01602dcab403eb96bd0e010a281f78c19"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564306 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9c291fcf4504b3c8fd028f59d69391ac5e780fe39587e60eef518bb784028716"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7a9fbeddb52e58cf66cc8cd7609fa52e9c30f8ad74e2ee62ac749cf8edfe5eb4"}
Mar 20 13:36:02 crc kubenswrapper[4755]: I0320 13:36:02.564610 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ffb3d5cc1b398ae6bc0b008cf6714afe6e3b0c0db380db488c104a7988c8d40a"}
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579188 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579234 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:03 crc kubenswrapper[4755]: I0320 13:36:03.579390 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.248920 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.249270 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:04 crc kubenswrapper[4755]: I0320 13:36:04.263560 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:05 crc kubenswrapper[4755]: I0320 13:36:05.052100 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 13:36:05 crc kubenswrapper[4755]: I0320 13:36:05.952812 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 20 13:36:07 crc kubenswrapper[4755]: I0320 13:36:07.698534 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.590370 4755 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.615118 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.615174 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.620602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.623560 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9f8b41c-3dfa-4aed-92f0-fe3c7dedcba8"
Mar 20 13:36:08 crc kubenswrapper[4755]: I0320 13:36:08.671544 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 20 13:36:09 crc kubenswrapper[4755]: I0320 13:36:09.622211 4755 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:09 crc kubenswrapper[4755]: I0320 13:36:09.622256 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3c5a2944-296d-48ba-915d-640503b92beb"
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.268249 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" containerID="cri-o://275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d" gracePeriod=15
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.634074 4755 generic.go:334] "Generic (PLEG): container finished" podID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerID="275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d" exitCode=0
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.634271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerDied","Data":"275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d"}
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.758809 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772674 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772712 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772738 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772809 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772898 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772918 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772938 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772958 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.772982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.773003 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") pod \"1ef1c7ef-1429-4467-abb5-837ad56896fb\" (UID: \"1ef1c7ef-1429-4467-abb5-837ad56896fb\") "
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.773492 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774138 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.774960 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.775583 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.781131 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.781877 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7" (OuterVolumeSpecName: "kube-api-access-vr2x7") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "kube-api-access-vr2x7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.784187 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.784370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785450 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.785532 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.788121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.790918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1ef1c7ef-1429-4467-abb5-837ad56896fb" (UID: "1ef1c7ef-1429-4467-abb5-837ad56896fb"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874049 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874085 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874095 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874105 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr2x7\" (UniqueName: \"kubernetes.io/projected/1ef1c7ef-1429-4467-abb5-837ad56896fb-kube-api-access-vr2x7\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874115 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874126 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874135 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874145 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874153 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874161 4755 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874172 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874182 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874190 4755 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1ef1c7ef-1429-4467-abb5-837ad56896fb-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:10 crc kubenswrapper[4755]: I0320 13:36:10.874199 4755 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1ef1c7ef-1429-4467-abb5-837ad56896fb-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.255744 4755 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="a9f8b41c-3dfa-4aed-92f0-fe3c7dedcba8"
Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p" event={"ID":"1ef1c7ef-1429-4467-abb5-837ad56896fb","Type":"ContainerDied","Data":"9edc35520733cdbb8ffbbdcc2f02ec6ef4e5e7ada3cc88f2fa7d388e53bb80dd"}
Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643202 4755 scope.go:117] "RemoveContainer" containerID="275299b496016b6e538e77208132a3740983e0b2343d1b7b92eb23e1596b8b3d"
Mar 20 13:36:11 crc kubenswrapper[4755]: I0320 13:36:11.643222 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-wpj5p"
Mar 20 13:36:15 crc kubenswrapper[4755]: E0320 13:36:15.251823 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:36:16 crc kubenswrapper[4755]: E0320 13:36:16.505137 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:36:16 crc kubenswrapper[4755]: E0320 13:36:16.532051 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:36:17 crc kubenswrapper[4755]: I0320 13:36:17.484324 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.616997 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.795444 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.809430 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 13:36:18 crc kubenswrapper[4755]: I0320 13:36:18.954091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.139170 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.217179 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.246035 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.468541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 20 13:36:19 crc kubenswrapper[4755]: I0320 13:36:19.640096 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.077973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.121848 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.145064 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.297643 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.347165 4755 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.357284 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.501505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.782306 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.785043 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.790365 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 13:36:20 crc kubenswrapper[4755]: I0320 13:36:20.833727 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.117098 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.281014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.321191 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.364123 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.372908 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.390908 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.413756 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.514403 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.541136 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.568808 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 13:36:21 crc kubenswrapper[4755]: I0320 13:36:21.886364 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.119587 4755 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.132003 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.251560 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.281089 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.327717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.359038 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.431599 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.435963 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.699798 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.819205 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.915928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 13:36:22 crc kubenswrapper[4755]: I0320 13:36:22.950209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.034257 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.090327 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.121957 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.150450 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.187857 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.236625 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.247342 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.271045 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.305955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.306552 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.544509 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.639519 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.662346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.902596 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.957638 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 13:36:23 crc kubenswrapper[4755]: I0320 13:36:23.958730 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.058268 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.076185 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.099973 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.134142 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.146007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.151417 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.183524 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.239498 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.303530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.385704 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.469729 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.503355 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.503817 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.507202 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.508803 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.859627 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.870672 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.927632 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.931398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:36:24 crc kubenswrapper[4755]: I0320 13:36:24.982724 4755 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.008281 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.073815 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.129204 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.197673 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.227982 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.257393 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.360928 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.382238 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.415979 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 
13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.464915 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.525029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.550162 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.565528 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.623218 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.686948 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.688397 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.793484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.845396 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:36:25 crc kubenswrapper[4755]: I0320 13:36:25.956377 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 
13:36:26.043316 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.425942 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.560169 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.838114 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.874526 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.882946 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.905236 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 13:36:26 crc kubenswrapper[4755]: I0320 13:36:26.983711 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.071382 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.209544 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.276285 4755 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.280448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.446176 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.522548 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.529858 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.541505 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.552318 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.692782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.752574 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.804151 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.810853 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.831155 4755 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.839350 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.886006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.938740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.962769 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.982450 4755 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.984007 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.985342 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59d967769f-g8465" podStartSLOduration=45.985316454 podStartE2EDuration="45.985316454s" podCreationTimestamp="2026-03-20 13:35:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:07.83325629 +0000 UTC m=+347.431188829" watchObservedRunningTime="2026-03-20 13:36:27.985316454 +0000 UTC m=+367.583249013" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.987991 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
podStartSLOduration=40.987971598 podStartE2EDuration="40.987971598s" podCreationTimestamp="2026-03-20 13:35:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:07.849053539 +0000 UTC m=+347.446986098" watchObservedRunningTime="2026-03-20 13:36:27.987971598 +0000 UTC m=+367.585904167" Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.993244 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-wpj5p","openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.993341 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:36:27 crc kubenswrapper[4755]: I0320 13:36:27.997975 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.001018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.021686 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.021634891 podStartE2EDuration="20.021634891s" podCreationTimestamp="2026-03-20 13:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:28.019951577 +0000 UTC m=+367.617884136" watchObservedRunningTime="2026-03-20 13:36:28.021634891 +0000 UTC m=+367.619567460" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.056524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.059123 4755 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.095969 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.149482 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.186952 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.224920 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.225359 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.228815 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.413220 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.476077 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.496289 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.571183 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.582733 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.646361 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.727310 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:28 crc kubenswrapper[4755]: E0320 13:36:28.728030 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728080 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: E0320 13:36:28.728108 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728126 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728472 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0668fdb-be01-431d-9cbb-dabae6eb44e1" containerName="installer" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.728521 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" containerName="oauth-openshift" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.730437 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.732332 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733276 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733618 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733734 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.733867 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.737420 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.738350 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.738865 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.739090 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.739118 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:36:28 crc kubenswrapper[4755]: 
I0320 13:36:28.739325 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743267 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743679 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743907 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.743984 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.744740 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.793229 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.794077 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.796443 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.799099 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841815 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841856 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.841978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842107 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " 
pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.842508 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.910576 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.921816 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.921881 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943183 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 
crc kubenswrapper[4755]: I0320 13:36:28.943237 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943296 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943391 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943416 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943444 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 
13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943493 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943550 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943608 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.943685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945485 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-service-ca\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945856 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-cliconfig\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.945917 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-policies\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.946032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/40e1333e-6ba5-4ac5-969b-06d408650a35-audit-dir\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.947455 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955841 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-router-certs\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.955845 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-error\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962559 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.962587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-serving-cert\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.963150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-system-session\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.963314 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/40e1333e-6ba5-4ac5-969b-06d408650a35-v4-0-config-user-template-login\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.966877 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"auto-csr-approver-29566896-bp947\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.975159 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdzr4\" (UniqueName: \"kubernetes.io/projected/40e1333e-6ba5-4ac5-969b-06d408650a35-kube-api-access-pdzr4\") pod \"oauth-openshift-66f8689f66-k824k\" (UID: \"40e1333e-6ba5-4ac5-969b-06d408650a35\") " pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:28 crc kubenswrapper[4755]: I0320 13:36:28.997604 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.077267 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.082679 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.085022 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.094227 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.116857 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.152178 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.173497 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.176251 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.235502 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ef1c7ef-1429-4467-abb5-837ad56896fb" path="/var/lib/kubelet/pods/1ef1c7ef-1429-4467-abb5-837ad56896fb/volumes" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.274687 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.285612 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.286614 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.355483 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.368169 4755 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.534171 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.687514 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.705925 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.717585 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.823822 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.899050 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.907922 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.928079 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.959961 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 13:36:29 crc kubenswrapper[4755]: I0320 13:36:29.991092 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.141021 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.179938 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.225703 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.245692 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.282062 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.310793 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.385275 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.470598 4755 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.470995 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" gracePeriod=5 Mar 20 13:36:30 crc 
kubenswrapper[4755]: I0320 13:36:30.476253 4755 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.481121 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.495587 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.504717 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.528013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.536645 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.611326 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.710778 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.727851 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.737087 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.784859 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.900268 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.945421 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:36:30 crc kubenswrapper[4755]: I0320 13:36:30.979725 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.080638 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.122476 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.166295 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"] Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.169565 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.210316 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.222581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.231547 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:36:31 crc kubenswrapper[4755]: W0320 13:36:31.243393 4755 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e1333e_6ba5_4ac5_969b_06d408650a35.slice/crio-ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a WatchSource:0}: Error finding container ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a: Status 404 returned error can't find the container with id ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.255342 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-66f8689f66-k824k"] Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.410647 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.535029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.547520 4755 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.555512 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.634197 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.634194 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.792381 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.816669 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerStarted","Data":"23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.819152 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" event={"ID":"40e1333e-6ba5-4ac5-969b-06d408650a35","Type":"ContainerStarted","Data":"3d32abd23a34772ba75f29cee3392a59eea59802700a2a91776a6a69dcd6d646"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.819290 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" event={"ID":"40e1333e-6ba5-4ac5-969b-06d408650a35","Type":"ContainerStarted","Data":"ace94aeb7b709327d2915d4ce53515eb9de26eec8837eca29ea2b4010a20d86a"} Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.823224 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.836378 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.845532 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.856151 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" podStartSLOduration=46.856128034 podStartE2EDuration="46.856128034s" podCreationTimestamp="2026-03-20 13:35:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:36:31.852614522 +0000 UTC m=+371.450547091" 
watchObservedRunningTime="2026-03-20 13:36:31.856128034 +0000 UTC m=+371.454060573" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.946909 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:36:31 crc kubenswrapper[4755]: I0320 13:36:31.972545 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.016116 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.134535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.244092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-66f8689f66-k824k" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.299597 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.312782 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.587401 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.725210 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.830310 4755 generic.go:334] "Generic (PLEG): container finished" podID="8532b92f-bed9-41b0-bf0d-99afa5703048" 
containerID="48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63" exitCode=0 Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.831177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerDied","Data":"48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63"} Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.855872 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.876035 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.886140 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.945706 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:36:32 crc kubenswrapper[4755]: I0320 13:36:32.952714 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.009187 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.024228 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.082573 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.087886 4755 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.244737 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.482554 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.550097 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.614647 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.677205 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.819692 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.829227 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.880439 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.883317 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:36:33 crc kubenswrapper[4755]: I0320 13:36:33.982531 4755 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.058509 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.156693 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.182111 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.287071 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.316260 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.347508 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") pod \"8532b92f-bed9-41b0-bf0d-99afa5703048\" (UID: \"8532b92f-bed9-41b0-bf0d-99afa5703048\") " Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.355784 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz" (OuterVolumeSpecName: "kube-api-access-56gzz") pod "8532b92f-bed9-41b0-bf0d-99afa5703048" (UID: "8532b92f-bed9-41b0-bf0d-99afa5703048"). InnerVolumeSpecName "kube-api-access-56gzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.448939 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56gzz\" (UniqueName: \"kubernetes.io/projected/8532b92f-bed9-41b0-bf0d-99afa5703048-kube-api-access-56gzz\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.614359 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.853445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-bp947" event={"ID":"8532b92f-bed9-41b0-bf0d-99afa5703048","Type":"ContainerDied","Data":"23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a"} Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.854042 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23522b550843dcadeda43ef47f2f27096cf312cd6c1b914cc16bf795845d690a" Mar 20 13:36:34 crc kubenswrapper[4755]: I0320 13:36:34.853610 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-bp947" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.016300 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.711246 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.864110 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.864197 4755 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" exitCode=137 Mar 20 13:36:35 crc kubenswrapper[4755]: I0320 13:36:35.946068 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.087209 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.087310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.283990 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284184 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284261 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284409 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284435 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.284606 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285000 4755 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285029 4755 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285047 4755 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.285065 4755 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.296721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.386543 4755 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875561 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875757 4755 scope.go:117] "RemoveContainer" containerID="0ab242e2282185039a9e0135229f6437921b750ab90074b2edbaf59e0bae32ab" Mar 20 13:36:36 crc kubenswrapper[4755]: I0320 13:36:36.875910 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.241792 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.242369 4755 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.257858 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.257945 4755 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="52cd97fa-790d-4793-945c-f2ccf7fc8986" Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.265331 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 13:36:37 crc kubenswrapper[4755]: I0320 13:36:37.265396 4755 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="52cd97fa-790d-4793-945c-f2ccf7fc8986" Mar 20 13:36:53 crc kubenswrapper[4755]: I0320 13:36:53.970908 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:53 crc kubenswrapper[4755]: I0320 13:36:53.972027 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d8rq7" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" containerID="cri-o://b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" gracePeriod=2 Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.169488 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.170249 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vm24m" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" containerID="cri-o://d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" gracePeriod=2 Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.469203 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.596339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.596477 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597409 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities" (OuterVolumeSpecName: "utilities") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597471 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") pod \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\" (UID: \"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.597860 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.606748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd" (OuterVolumeSpecName: "kube-api-access-8rpkd") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "kube-api-access-8rpkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.662056 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" (UID: "a751ac46-3f89-4d5a-8a23-0bbb3584dfa0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.680390 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.700496 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.700915 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rpkd\" (UniqueName: \"kubernetes.io/projected/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0-kube-api-access-8rpkd\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802540 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802628 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.802726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") pod \"184aa529-45c4-42c9-8eee-04bd18fba718\" (UID: \"184aa529-45c4-42c9-8eee-04bd18fba718\") " Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.803765 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities" (OuterVolumeSpecName: "utilities") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: 
"184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.806878 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8" (OuterVolumeSpecName: "kube-api-access-pfxf8") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: "184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "kube-api-access-pfxf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.856969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "184aa529-45c4-42c9-8eee-04bd18fba718" (UID: "184aa529-45c4-42c9-8eee-04bd18fba718"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.904946 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfxf8\" (UniqueName: \"kubernetes.io/projected/184aa529-45c4-42c9-8eee-04bd18fba718-kube-api-access-pfxf8\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.905310 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:54 crc kubenswrapper[4755]: I0320 13:36:54.905435 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184aa529-45c4-42c9-8eee-04bd18fba718-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009229 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" exitCode=0 Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009330 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d8rq7" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009336 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009387 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d8rq7" event={"ID":"a751ac46-3f89-4d5a-8a23-0bbb3584dfa0","Type":"ContainerDied","Data":"b82219efa86cff3e92cd1609c0f3a02dacbb886afd0558266c139f378ee30512"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.009412 4755 scope.go:117] "RemoveContainer" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015148 4755 generic.go:334] "Generic (PLEG): container finished" podID="184aa529-45c4-42c9-8eee-04bd18fba718" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" exitCode=0 Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015263 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vm24m" 
event={"ID":"184aa529-45c4-42c9-8eee-04bd18fba718","Type":"ContainerDied","Data":"24590f57d7a71fc728e5565f5f244096a80e169677073f3c84f013acdef85509"} Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.015271 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vm24m" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.042378 4755 scope.go:117] "RemoveContainer" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.056834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.069434 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d8rq7"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.073365 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.076488 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vm24m"] Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.082554 4755 scope.go:117] "RemoveContainer" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.103381 4755 scope.go:117] "RemoveContainer" containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.103864 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": container with ID starting with b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e not found: ID does not exist" 
containerID="b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.103993 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e"} err="failed to get container status \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": rpc error: code = NotFound desc = could not find container \"b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e\": container with ID starting with b849b767b5cd41793605f75ca036c69cad8b8ee5409df7c5eb7663faa87e1e9e not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104098 4755 scope.go:117] "RemoveContainer" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.104505 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": container with ID starting with 3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19 not found: ID does not exist" containerID="3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104545 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19"} err="failed to get container status \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": rpc error: code = NotFound desc = could not find container \"3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19\": container with ID starting with 3504b2161cd09ab605060ce308fa51646130e100e6dc518d6216a110c6816e19 not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.104580 4755 scope.go:117] 
"RemoveContainer" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.105715 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": container with ID starting with 707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d not found: ID does not exist" containerID="707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.105746 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d"} err="failed to get container status \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": rpc error: code = NotFound desc = could not find container \"707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d\": container with ID starting with 707a97fc195e2c8e0fd5906196efb0d4c406f877bdbee00eac0bf6b710ef1b3d not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.105763 4755 scope.go:117] "RemoveContainer" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.125274 4755 scope.go:117] "RemoveContainer" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.146567 4755 scope.go:117] "RemoveContainer" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.164714 4755 scope.go:117] "RemoveContainer" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.166256 4755 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": container with ID starting with d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a not found: ID does not exist" containerID="d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166306 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a"} err="failed to get container status \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": rpc error: code = NotFound desc = could not find container \"d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a\": container with ID starting with d7c7e0c6c4cdb0f2304956974f930d62b52c2143a48b54ebad3fc9461ecb705a not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166341 4755 scope.go:117] "RemoveContainer" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.166625 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": container with ID starting with b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d not found: ID does not exist" containerID="b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166654 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d"} err="failed to get container status \"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": rpc error: code = NotFound desc = could not find container 
\"b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d\": container with ID starting with b294edc842844a62c2e69044f89d9ecc532d0e74584c264947bea5235b38836d not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.166696 4755 scope.go:117] "RemoveContainer" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: E0320 13:36:55.167209 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": container with ID starting with a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef not found: ID does not exist" containerID="a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.167276 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef"} err="failed to get container status \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": rpc error: code = NotFound desc = could not find container \"a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef\": container with ID starting with a57432c35b604a0896e8f6a11a1a4473a6c2675d47b8816508a9cb2f0feb97ef not found: ID does not exist" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.232590 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" path="/var/lib/kubelet/pods/184aa529-45c4-42c9-8eee-04bd18fba718/volumes" Mar 20 13:36:55 crc kubenswrapper[4755]: I0320 13:36:55.233551 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" path="/var/lib/kubelet/pods/a751ac46-3f89-4d5a-8a23-0bbb3584dfa0/volumes" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.372553 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.373034 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nlslg" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" containerID="cri-o://a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" gracePeriod=2 Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.882928 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.941840 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.941903 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.942006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") pod \"ce4d5763-1786-4b87-8497-0c65da46f446\" (UID: \"ce4d5763-1786-4b87-8497-0c65da46f446\") " Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.943197 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities" (OuterVolumeSpecName: "utilities") pod 
"ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.949309 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx" (OuterVolumeSpecName: "kube-api-access-nblfx") pod "ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "kube-api-access-nblfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:56 crc kubenswrapper[4755]: I0320 13:36:56.994259 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce4d5763-1786-4b87-8497-0c65da46f446" (UID: "ce4d5763-1786-4b87-8497-0c65da46f446"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034149 4755 generic.go:334] "Generic (PLEG): container finished" podID="ce4d5763-1786-4b87-8497-0c65da46f446" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" exitCode=0 Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"} Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034221 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nlslg" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034240 4755 scope.go:117] "RemoveContainer" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.034228 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nlslg" event={"ID":"ce4d5763-1786-4b87-8497-0c65da46f446","Type":"ContainerDied","Data":"bb5a3902ca696b25e9dbeeedf5e8db5f8e3f8309ebd2a32415e02115d8dad95e"} Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045485 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nblfx\" (UniqueName: \"kubernetes.io/projected/ce4d5763-1786-4b87-8497-0c65da46f446-kube-api-access-nblfx\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045955 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.045969 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce4d5763-1786-4b87-8497-0c65da46f446-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.057846 4755 scope.go:117] "RemoveContainer" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.069607 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.074748 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nlslg"] Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.088226 4755 scope.go:117] 
"RemoveContainer" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101108 4755 scope.go:117] "RemoveContainer" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.101775 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": container with ID starting with a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2 not found: ID does not exist" containerID="a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101823 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2"} err="failed to get container status \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": rpc error: code = NotFound desc = could not find container \"a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2\": container with ID starting with a9bc4fd1ecd6fd6c6903a504be3a77869d5036efc4ee118ec14b3c485f8e54d2 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.101855 4755 scope.go:117] "RemoveContainer" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.102292 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": container with ID starting with b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98 not found: ID does not exist" containerID="b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98" Mar 20 13:36:57 crc 
kubenswrapper[4755]: I0320 13:36:57.102366 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98"} err="failed to get container status \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": rpc error: code = NotFound desc = could not find container \"b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98\": container with ID starting with b1e94e4f207492628740d8a1cb1fbb4865348cf23c04029b3c609b5bb9b45d98 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.102408 4755 scope.go:117] "RemoveContainer" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: E0320 13:36:57.102856 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": container with ID starting with b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64 not found: ID does not exist" containerID="b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.102889 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64"} err="failed to get container status \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": rpc error: code = NotFound desc = could not find container \"b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64\": container with ID starting with b5d5c194447e1555f9de8ce5457f22b50fc082a78f35a36ab4b973f6f64a7b64 not found: ID does not exist" Mar 20 13:36:57 crc kubenswrapper[4755]: I0320 13:36:57.234884 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" 
path="/var/lib/kubelet/pods/ce4d5763-1786-4b87-8497-0c65da46f446/volumes" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.270425 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272355 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272395 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272431 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272452 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272500 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272520 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272545 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272576 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272594 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272626 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272646 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272704 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272722 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272753 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272772 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272803 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272820 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272852 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" 
containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272871 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-content" Mar 20 13:37:11 crc kubenswrapper[4755]: E0320 13:37:11.272893 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.272912 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="extract-utilities" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273238 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" containerName="oc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273265 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4d5763-1786-4b87-8497-0c65da46f446" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273285 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="184aa529-45c4-42c9-8eee-04bd18fba718" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273300 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.273312 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a751ac46-3f89-4d5a-8a23-0bbb3584dfa0" containerName="registry-server" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.274119 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.298261 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.359972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360271 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.360573 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.399738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461737 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461794 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 
13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461815 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461879 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.461910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.462810 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-ca-trust-extracted\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.463641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-certificates\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.464081 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-trusted-ca\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.469894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-installation-pull-secrets\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.470406 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-registry-tls\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.486138 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k92xm\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-kube-api-access-k92xm\") pod \"image-registry-66df7c8f76-tflvc\" (UID: \"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.493039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3f269bbf-e222-4a67-a28b-c02bdf9bb7b5-bound-sa-token\") pod \"image-registry-66df7c8f76-tflvc\" (UID: 
\"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5\") " pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:11 crc kubenswrapper[4755]: I0320 13:37:11.613862 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:12 crc kubenswrapper[4755]: I0320 13:37:12.098762 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-tflvc"] Mar 20 13:37:12 crc kubenswrapper[4755]: I0320 13:37:12.145693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" event={"ID":"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5","Type":"ContainerStarted","Data":"5b1596ed1b6752a14b0f23edee121913b572c520f5dd7bb0b18894f636bfdfae"} Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.154693 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" event={"ID":"3f269bbf-e222-4a67-a28b-c02bdf9bb7b5","Type":"ContainerStarted","Data":"d3e00949a794bbe84cf9b546ffe377923acff43ce4c3c50704838528d8d3e89a"} Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.155237 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:13 crc kubenswrapper[4755]: I0320 13:37:13.194335 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" podStartSLOduration=2.194311542 podStartE2EDuration="2.194311542s" podCreationTimestamp="2026-03-20 13:37:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:13.188702768 +0000 UTC m=+412.786635357" watchObservedRunningTime="2026-03-20 13:37:13.194311542 +0000 UTC m=+412.792244071" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 
13:37:28.619897 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.620850 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-shzbw" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" containerID="cri-o://bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.642717 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.643043 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cgznb" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" containerID="cri-o://4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.655808 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.656166 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" containerID="cri-o://5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.684735 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.685055 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-929x7" 
podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" containerID="cri-o://2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.687446 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.687778 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-lkvql" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" containerID="cri-o://ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" gracePeriod=30 Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.695683 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.696404 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.696481 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744424 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744480 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.744521 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846221 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.846269 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.849032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.857102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6d1fc18c-b364-439b-926f-12fe310d0917-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:28 crc kubenswrapper[4755]: I0320 13:37:28.868019 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k2bf\" (UniqueName: \"kubernetes.io/projected/6d1fc18c-b364-439b-926f-12fe310d0917-kube-api-access-9k2bf\") pod \"marketplace-operator-79b997595-ngw4b\" (UID: \"6d1fc18c-b364-439b-926f-12fe310d0917\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.142855 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.148687 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.160801 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.165958 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.194335 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.218826 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250429 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250492 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250519 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250564 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hm6\" (UniqueName: \"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250593 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250622 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250699 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250741 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") pod \"e8e34571-6648-4e5e-b3e9-05f87454e19a\" (UID: \"e8e34571-6648-4e5e-b3e9-05f87454e19a\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250821 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: 
\"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250847 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") pod \"eca3198b-684d-4a52-b4aa-858ced996bae\" (UID: \"eca3198b-684d-4a52-b4aa-858ced996bae\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") pod \"2db67acd-25db-47a7-80ea-da4065a60e23\" (UID: \"2db67acd-25db-47a7-80ea-da4065a60e23\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250926 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") pod \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\" (UID: \"2d2017d2-f4ee-4056-b350-cc313f3faeaf\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.250953 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") pod \"887fa242-bd5e-40f5-8f6e-a81c6e976322\" (UID: \"887fa242-bd5e-40f5-8f6e-a81c6e976322\") " Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.259087 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb" (OuterVolumeSpecName: "kube-api-access-qsjvb") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "kube-api-access-qsjvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.259563 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6" (OuterVolumeSpecName: "kube-api-access-m5hm6") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "kube-api-access-m5hm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.260448 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.260747 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities" (OuterVolumeSpecName: "utilities") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.264524 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities" (OuterVolumeSpecName: "utilities") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265752 4755 generic.go:334] "Generic (PLEG): container finished" podID="eca3198b-684d-4a52-b4aa-858ced996bae" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerDied","Data":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.265962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" event={"ID":"eca3198b-684d-4a52-b4aa-858ced996bae","Type":"ContainerDied","Data":"ffd17bcea5582e9144ff86b2de342c1b3c61951742cefde886baf98d6e66252d"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.266031 4755 scope.go:117] "RemoveContainer" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.266138 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-229g6" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.267050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities" (OuterVolumeSpecName: "utilities") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.267196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities" (OuterVolumeSpecName: "utilities") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277272 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9" (OuterVolumeSpecName: "kube-api-access-w5qg9") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "kube-api-access-w5qg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277441 4755 generic.go:334] "Generic (PLEG): container finished" podID="2db67acd-25db-47a7-80ea-da4065a60e23" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277498 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.283773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-shzbw" event={"ID":"2db67acd-25db-47a7-80ea-da4065a60e23","Type":"ContainerDied","Data":"0ab76dafe853da1151a253ddbccefd2f71d9bf47c5abfc10da67278b7f81253e"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.280105 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "eca3198b-684d-4a52-b4aa-858ced996bae" (UID: "eca3198b-684d-4a52-b4aa-858ced996bae"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.280817 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5" (OuterVolumeSpecName: "kube-api-access-hchb5") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "kube-api-access-hchb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.277632 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-shzbw" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285006 4755 generic.go:334] "Generic (PLEG): container finished" podID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285128 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-929x7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285138 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-929x7" event={"ID":"2d2017d2-f4ee-4056-b350-cc313f3faeaf","Type":"ContainerDied","Data":"3878ee84a55f546e614e6295ef5c5640620ecb3a644bdf257cffd7dc5b2a3b27"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.285432 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc" (OuterVolumeSpecName: "kube-api-access-g2rrc") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). InnerVolumeSpecName "kube-api-access-g2rrc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.290915 4755 scope.go:117] "RemoveContainer" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.293171 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": container with ID starting with 5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12 not found: ID does not exist" containerID="5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.293204 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12"} err="failed to get container status \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": rpc error: code = NotFound desc = could not find container \"5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12\": container with ID starting with 5c971b3fbaf6790e8cfb2af0e6cfbd4c723f59dfb724ac3b52d2894098ef6c12 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.293226 4755 scope.go:117] "RemoveContainer" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.294238 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d2017d2-f4ee-4056-b350-cc313f3faeaf" (UID: "2d2017d2-f4ee-4056-b350-cc313f3faeaf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296267 4755 generic.go:334] "Generic (PLEG): container finished" podID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296397 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296451 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lkvql" event={"ID":"887fa242-bd5e-40f5-8f6e-a81c6e976322","Type":"ContainerDied","Data":"ea85ece18daec304b9cecefa9ca55b3c7ddbfc128e021ebc4bfd2b1a692b4346"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.296522 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lkvql" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301251 4755 generic.go:334] "Generic (PLEG): container finished" podID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" exitCode=0 Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cgznb" event={"ID":"e8e34571-6648-4e5e-b3e9-05f87454e19a","Type":"ContainerDied","Data":"28734b0d2914118b3d9d2819be5a8fd3a2768be1a04f071ed6cc45a5baf248f6"} Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.301356 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cgznb" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.309985 4755 scope.go:117] "RemoveContainer" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.334501 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e8e34571-6648-4e5e-b3e9-05f87454e19a" (UID: "e8e34571-6648-4e5e-b3e9-05f87454e19a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.338202 4755 scope.go:117] "RemoveContainer" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.343624 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2db67acd-25db-47a7-80ea-da4065a60e23" (UID: "2db67acd-25db-47a7-80ea-da4065a60e23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352370 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2rrc\" (UniqueName: \"kubernetes.io/projected/887fa242-bd5e-40f5-8f6e-a81c6e976322-kube-api-access-g2rrc\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352428 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352442 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w5qg9\" (UniqueName: \"kubernetes.io/projected/e8e34571-6648-4e5e-b3e9-05f87454e19a-kube-api-access-w5qg9\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352474 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352489 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hm6\" (UniqueName: 
\"kubernetes.io/projected/eca3198b-684d-4a52-b4aa-858ced996bae-kube-api-access-m5hm6\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352503 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352514 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352528 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsjvb\" (UniqueName: \"kubernetes.io/projected/2d2017d2-f4ee-4056-b350-cc313f3faeaf-kube-api-access-qsjvb\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352561 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hchb5\" (UniqueName: \"kubernetes.io/projected/2db67acd-25db-47a7-80ea-da4065a60e23-kube-api-access-hchb5\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352572 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8e34571-6648-4e5e-b3e9-05f87454e19a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352583 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352594 4755 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/eca3198b-684d-4a52-b4aa-858ced996bae-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352607 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2db67acd-25db-47a7-80ea-da4065a60e23-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.352633 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d2017d2-f4ee-4056-b350-cc313f3faeaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.361339 4755 scope.go:117] "RemoveContainer" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.362481 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": container with ID starting with bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7 not found: ID does not exist" containerID="bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.362546 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7"} err="failed to get container status \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": rpc error: code = NotFound desc = could not find container \"bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7\": container with ID starting with bf11ee961be31d418fe7ed3c42f09b4345bcf5dc3d034d5d472b76b20c59a6c7 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.362583 4755 scope.go:117] 
"RemoveContainer" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.363001 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": container with ID starting with 76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12 not found: ID does not exist" containerID="76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363048 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12"} err="failed to get container status \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": rpc error: code = NotFound desc = could not find container \"76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12\": container with ID starting with 76e3418a0ff2f5553edacb25eb4193d3c2429b6f311e7f9575ea25725f2cfc12 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363082 4755 scope.go:117] "RemoveContainer" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.363387 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": container with ID starting with 6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2 not found: ID does not exist" containerID="6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363499 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2"} err="failed to get container status \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": rpc error: code = NotFound desc = could not find container \"6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2\": container with ID starting with 6fef83cffb656e5360264e91faadfa3e5059e1d7f4712a5f75f13e0d802c20e2 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.363572 4755 scope.go:117] "RemoveContainer" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.400702 4755 scope.go:117] "RemoveContainer" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.410082 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ngw4b"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.417894 4755 scope.go:117] "RemoveContainer" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: W0320 13:37:29.421209 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d1fc18c_b364_439b_926f_12fe310d0917.slice/crio-2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f WatchSource:0}: Error finding container 2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f: Status 404 returned error can't find the container with id 2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439292 4755 scope.go:117] "RemoveContainer" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.439913 4755 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": container with ID starting with 2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247 not found: ID does not exist" containerID="2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439957 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247"} err="failed to get container status \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": rpc error: code = NotFound desc = could not find container \"2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247\": container with ID starting with 2a843ab1947c6495004024b4b0e59229a31d4ceda4a0d372ab5ef0424f992247 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.439985 4755 scope.go:117] "RemoveContainer" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.440441 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": container with ID starting with 49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de not found: ID does not exist" containerID="49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.440505 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de"} err="failed to get container status \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": rpc error: code = NotFound desc = could not find 
container \"49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de\": container with ID starting with 49b77dd8964b2f122f59cbc4b3ede4119c7f421e9719d47a56cbd80120bbb4de not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.440550 4755 scope.go:117] "RemoveContainer" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.441588 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": container with ID starting with e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857 not found: ID does not exist" containerID="e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.441614 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857"} err="failed to get container status \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": rpc error: code = NotFound desc = could not find container \"e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857\": container with ID starting with e7190b036f4f470edea59dedcce12aacb09fce7f53a09198660a7c0fb0392857 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.441630 4755 scope.go:117] "RemoveContainer" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.457369 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "887fa242-bd5e-40f5-8f6e-a81c6e976322" (UID: "887fa242-bd5e-40f5-8f6e-a81c6e976322"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.457788 4755 scope.go:117] "RemoveContainer" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.488531 4755 scope.go:117] "RemoveContainer" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.519871 4755 scope.go:117] "RemoveContainer" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520262 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": container with ID starting with ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520 not found: ID does not exist" containerID="ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520296 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520"} err="failed to get container status \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": rpc error: code = NotFound desc = could not find container \"ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520\": container with ID starting with ccd8c5942dd3972af496744c730227dfb074fd2e55bea78a83f409fce9670520 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520317 4755 scope.go:117] "RemoveContainer" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520556 4755 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": container with ID starting with 1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85 not found: ID does not exist" containerID="1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520588 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85"} err="failed to get container status \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": rpc error: code = NotFound desc = could not find container \"1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85\": container with ID starting with 1c14c27bc18332c3b22b72a9dce17d97ef4727b245c7f67e7c86c530848d1b85 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520606 4755 scope.go:117] "RemoveContainer" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.520829 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": container with ID starting with 388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad not found: ID does not exist" containerID="388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520849 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad"} err="failed to get container status \"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": rpc error: code = NotFound desc = could not find container 
\"388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad\": container with ID starting with 388a4722fb9668f2f3df076f270cda847725e0a7c7b387d3fb564b2c2581adad not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.520866 4755 scope.go:117] "RemoveContainer" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.539111 4755 scope.go:117] "RemoveContainer" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.554793 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/887fa242-bd5e-40f5-8f6e-a81c6e976322-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.559280 4755 scope.go:117] "RemoveContainer" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.571945 4755 scope.go:117] "RemoveContainer" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.572234 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": container with ID starting with 4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476 not found: ID does not exist" containerID="4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572279 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476"} err="failed to get container status \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": rpc error: 
code = NotFound desc = could not find container \"4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476\": container with ID starting with 4227ee3babc30056ac74599750210bc824661531baef8599aec979f680890476 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572303 4755 scope.go:117] "RemoveContainer" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.572642 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": container with ID starting with a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee not found: ID does not exist" containerID="a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572754 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee"} err="failed to get container status \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": rpc error: code = NotFound desc = could not find container \"a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee\": container with ID starting with a1f29e76c1427d5de991b3f526e4b893c28e99eea8bfc2d83ed9a557837788ee not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.572804 4755 scope.go:117] "RemoveContainer" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: E0320 13:37:29.573365 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": container with ID starting with 
eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703 not found: ID does not exist" containerID="eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.573400 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703"} err="failed to get container status \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": rpc error: code = NotFound desc = could not find container \"eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703\": container with ID starting with eba742748cff448e5851fb230988ba9d284a33ce0d75c43cfffe7b1b0cd95703 not found: ID does not exist" Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.596778 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.600245 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-229g6"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.630006 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.633311 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-929x7"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.648224 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.668601 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-lkvql"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.694037 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 
13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.702852 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-shzbw"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.705668 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:29 crc kubenswrapper[4755]: I0320 13:37:29.709123 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cgznb"] Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.236889 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237203 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237241 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237254 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237273 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237286 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237307 4755 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237320 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237340 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237354 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237369 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237383 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237399 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237410 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237433 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237446 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237493 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237508 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237521 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="extract-utilities" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237540 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237553 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237567 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237580 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="extract-content" Mar 20 13:37:30 crc kubenswrapper[4755]: E0320 13:37:30.237598 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237609 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237781 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eca3198b-684d-4a52-b4aa-858ced996bae" containerName="marketplace-operator" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237800 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237816 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237836 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.237863 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" containerName="registry-server" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.239087 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.242478 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.254075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289091 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.289163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312294 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" 
event={"ID":"6d1fc18c-b364-439b-926f-12fe310d0917","Type":"ContainerStarted","Data":"7f2ba372670391d5fcf019a9e918249106823165de1f1a45210f90b435f1c486"} Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312342 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" event={"ID":"6d1fc18c-b364-439b-926f-12fe310d0917","Type":"ContainerStarted","Data":"2761d94079f866b1a210105e17f1b45692d3b24305664b73604cd4f53fb2504f"} Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.312473 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.316672 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.332509 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ngw4b" podStartSLOduration=2.332469686 podStartE2EDuration="2.332469686s" podCreationTimestamp="2026-03-20 13:37:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:37:30.33044893 +0000 UTC m=+429.928381459" watchObservedRunningTime="2026-03-20 13:37:30.332469686 +0000 UTC m=+429.930402225" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390337 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390454 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.390500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.391610 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-catalog-content\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.392814 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1107b669-3bdf-4189-a37a-b79ddb758fff-utilities\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.426676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq68d\" (UniqueName: \"kubernetes.io/projected/1107b669-3bdf-4189-a37a-b79ddb758fff-kube-api-access-lq68d\") pod \"redhat-marketplace-srzwn\" (UID: \"1107b669-3bdf-4189-a37a-b79ddb758fff\") " pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:30 crc kubenswrapper[4755]: I0320 13:37:30.555022 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.007062 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-srzwn"] Mar 20 13:37:31 crc kubenswrapper[4755]: W0320 13:37:31.014688 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1107b669_3bdf_4189_a37a_b79ddb758fff.slice/crio-355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f WatchSource:0}: Error finding container 355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f: Status 404 returned error can't find the container with id 355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.246279 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d2017d2-f4ee-4056-b350-cc313f3faeaf" path="/var/lib/kubelet/pods/2d2017d2-f4ee-4056-b350-cc313f3faeaf/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.250481 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2db67acd-25db-47a7-80ea-da4065a60e23" path="/var/lib/kubelet/pods/2db67acd-25db-47a7-80ea-da4065a60e23/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.251380 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="887fa242-bd5e-40f5-8f6e-a81c6e976322" path="/var/lib/kubelet/pods/887fa242-bd5e-40f5-8f6e-a81c6e976322/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.252232 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8e34571-6648-4e5e-b3e9-05f87454e19a" path="/var/lib/kubelet/pods/e8e34571-6648-4e5e-b3e9-05f87454e19a/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.253089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca3198b-684d-4a52-b4aa-858ced996bae" 
path="/var/lib/kubelet/pods/eca3198b-684d-4a52-b4aa-858ced996bae/volumes" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.253641 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.256372 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.256541 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.264608 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.303859 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " 
pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325271 4755 generic.go:334] "Generic (PLEG): container finished" podID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerID="ce76a7d0c9414180f481727757d7b93f76b91fd1dd18b729c9b739307d7f3e2f" exitCode=0 Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerDied","Data":"ce76a7d0c9414180f481727757d7b93f76b91fd1dd18b729c9b739307d7f3e2f"} Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.325444 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerStarted","Data":"355bd1bc49f9958785e94d7445aa5d4a92b13da23c4ca2c7acdf1cc66635406f"} Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.405932 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406739 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod 
\"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.406796 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-utilities\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.407034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f483e049-5032-496f-8608-494e07922763-catalog-content\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.433296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7pl\" (UniqueName: \"kubernetes.io/projected/f483e049-5032-496f-8608-494e07922763-kube-api-access-fp7pl\") pod \"redhat-operators-c9s6q\" (UID: \"f483e049-5032-496f-8608-494e07922763\") " pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.620027 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-tflvc" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.621207 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:31 crc kubenswrapper[4755]: I0320 13:37:31.711250 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.134813 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-c9s6q"] Mar 20 13:37:32 crc kubenswrapper[4755]: W0320 13:37:32.138199 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf483e049_5032_496f_8608_494e07922763.slice/crio-b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca WatchSource:0}: Error finding container b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca: Status 404 returned error can't find the container with id b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333181 4755 generic.go:334] "Generic (PLEG): container finished" podID="f483e049-5032-496f-8608-494e07922763" containerID="4a6873996a090c1ab71ee644f4d8f0225205dd1c16487fa9851d034e5bc18c2a" exitCode=0 Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerDied","Data":"4a6873996a090c1ab71ee644f4d8f0225205dd1c16487fa9851d034e5bc18c2a"} Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.333909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"b77ce34b6fb663fd8653822dc239412319c821482b03eea7278da8a76a03b5ca"} Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.639990 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.641637 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.647755 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.652930 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729442 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdwwd\" (UniqueName: \"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729504 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.729525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831394 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zdwwd\" (UniqueName: \"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.831941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-catalog-content\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.832048 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/504e1957-f41e-4927-927f-d5ac7e8eb625-utilities\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.857071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdwwd\" (UniqueName: 
\"kubernetes.io/projected/504e1957-f41e-4927-927f-d5ac7e8eb625-kube-api-access-zdwwd\") pod \"community-operators-6g8x4\" (UID: \"504e1957-f41e-4927-927f-d5ac7e8eb625\") " pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:32 crc kubenswrapper[4755]: I0320 13:37:32.980435 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.342579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694"} Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.345205 4755 generic.go:334] "Generic (PLEG): container finished" podID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerID="3e7af82c28d3451b21b1a29643459a1913ef0639ea44724458fcd3ef408c61b7" exitCode=0 Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.345298 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerDied","Data":"3e7af82c28d3451b21b1a29643459a1913ef0639ea44724458fcd3ef408c61b7"} Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.481245 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6g8x4"] Mar 20 13:37:33 crc kubenswrapper[4755]: W0320 13:37:33.486624 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod504e1957_f41e_4927_927f_d5ac7e8eb625.slice/crio-49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78 WatchSource:0}: Error finding container 49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78: Status 404 returned error can't find the container with id 
49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78 Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.634224 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.636352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.641102 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.646728 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759718 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.759783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " 
pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861451 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861481 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.861907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-utilities\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.862032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b421640-e220-4567-8600-8e0ba78a981a-catalog-content\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " 
pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.888094 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfkcc\" (UniqueName: \"kubernetes.io/projected/2b421640-e220-4567-8600-8e0ba78a981a-kube-api-access-qfkcc\") pod \"certified-operators-nql9k\" (UID: \"2b421640-e220-4567-8600-8e0ba78a981a\") " pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:33 crc kubenswrapper[4755]: I0320 13:37:33.969031 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.206197 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nql9k"] Mar 20 13:37:34 crc kubenswrapper[4755]: W0320 13:37:34.212941 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b421640_e220_4567_8600_8e0ba78a981a.slice/crio-e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862 WatchSource:0}: Error finding container e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862: Status 404 returned error can't find the container with id e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.355182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-srzwn" event={"ID":"1107b669-3bdf-4189-a37a-b79ddb758fff","Type":"ContainerStarted","Data":"770d69d023566d2bc06547337ff64b3e1944607bd793540cdd3283051a76262e"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.356516 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b421640-e220-4567-8600-8e0ba78a981a" containerID="6e7ad26e08ee1cb8fc5d82c6121820f29a80fe7890ca59176dcc322f331b168d" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 
13:37:34.356578 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerDied","Data":"6e7ad26e08ee1cb8fc5d82c6121820f29a80fe7890ca59176dcc322f331b168d"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.356613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"e5222aa61d116b3e3c253e78ce3af7191091ff2bef4584c960ff6156a5d0e862"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360387 4755 generic.go:334] "Generic (PLEG): container finished" podID="504e1957-f41e-4927-927f-d5ac7e8eb625" containerID="696a395a41fc5c2cad8e0190521f19c702e94a769590b9c0630323b307518eaf" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360734 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerDied","Data":"696a395a41fc5c2cad8e0190521f19c702e94a769590b9c0630323b307518eaf"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.360808 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"49c6d2a38a0f35e2b30b89c82bfb4de171c31d7965784df88a2b53877dfaeb78"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.364548 4755 generic.go:334] "Generic (PLEG): container finished" podID="f483e049-5032-496f-8608-494e07922763" containerID="e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694" exitCode=0 Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.364595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" 
event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerDied","Data":"e8572116327aa3808e0daf8afd36ba21267844e96c9401b6c9a39cd1d7218694"} Mar 20 13:37:34 crc kubenswrapper[4755]: I0320 13:37:34.403035 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-srzwn" podStartSLOduration=1.85891135 podStartE2EDuration="4.403008743s" podCreationTimestamp="2026-03-20 13:37:30 +0000 UTC" firstStartedPulling="2026-03-20 13:37:31.32642831 +0000 UTC m=+430.924360849" lastFinishedPulling="2026-03-20 13:37:33.870525723 +0000 UTC m=+433.468458242" observedRunningTime="2026-03-20 13:37:34.379866215 +0000 UTC m=+433.977798744" watchObservedRunningTime="2026-03-20 13:37:34.403008743 +0000 UTC m=+434.000941272" Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.371982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.374957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-c9s6q" event={"ID":"f483e049-5032-496f-8608-494e07922763","Type":"ContainerStarted","Data":"3dc34b30bf3966ed77988b050ebc2a79979f98ab189b065770efcef17170409f"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.377204 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0"} Mar 20 13:37:35 crc kubenswrapper[4755]: I0320 13:37:35.416864 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-c9s6q" podStartSLOduration=1.933811737 podStartE2EDuration="4.416837836s" 
podCreationTimestamp="2026-03-20 13:37:31 +0000 UTC" firstStartedPulling="2026-03-20 13:37:32.335284206 +0000 UTC m=+431.933216735" lastFinishedPulling="2026-03-20 13:37:34.818310305 +0000 UTC m=+434.416242834" observedRunningTime="2026-03-20 13:37:35.41266815 +0000 UTC m=+435.010600699" watchObservedRunningTime="2026-03-20 13:37:35.416837836 +0000 UTC m=+435.014770365" Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.390205 4755 generic.go:334] "Generic (PLEG): container finished" podID="2b421640-e220-4567-8600-8e0ba78a981a" containerID="6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b" exitCode=0 Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.390289 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerDied","Data":"6aae85209875e48f8512f2703a3762a52951a87f9743bd080810cdcb9a97dd2b"} Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.393447 4755 generic.go:334] "Generic (PLEG): container finished" podID="504e1957-f41e-4927-927f-d5ac7e8eb625" containerID="628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0" exitCode=0 Mar 20 13:37:36 crc kubenswrapper[4755]: I0320 13:37:36.395573 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerDied","Data":"628783d49078f423d100ac8dd4fa416c59eff99f8730012f920ca8a3ea473db0"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.405971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nql9k" event={"ID":"2b421640-e220-4567-8600-8e0ba78a981a","Type":"ContainerStarted","Data":"341d50118145d4c32134de2378d95edc5af47e2018c48c3096921a2849e7a30e"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.408768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6g8x4" event={"ID":"504e1957-f41e-4927-927f-d5ac7e8eb625","Type":"ContainerStarted","Data":"eec474a6b04a7c91815a071ff884e25ff0c39cc60f17c3469adceb3d7ee6d1f7"} Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.425166 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nql9k" podStartSLOduration=1.913511437 podStartE2EDuration="4.425146896s" podCreationTimestamp="2026-03-20 13:37:33 +0000 UTC" firstStartedPulling="2026-03-20 13:37:34.357767316 +0000 UTC m=+433.955699835" lastFinishedPulling="2026-03-20 13:37:36.869402745 +0000 UTC m=+436.467335294" observedRunningTime="2026-03-20 13:37:37.423859271 +0000 UTC m=+437.021791810" watchObservedRunningTime="2026-03-20 13:37:37.425146896 +0000 UTC m=+437.023079425" Mar 20 13:37:37 crc kubenswrapper[4755]: I0320 13:37:37.451713 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6g8x4" podStartSLOduration=2.97094628 podStartE2EDuration="5.451688537s" podCreationTimestamp="2026-03-20 13:37:32 +0000 UTC" firstStartedPulling="2026-03-20 13:37:34.362083436 +0000 UTC m=+433.960015965" lastFinishedPulling="2026-03-20 13:37:36.842825683 +0000 UTC m=+436.440758222" observedRunningTime="2026-03-20 13:37:37.447168842 +0000 UTC m=+437.045101401" watchObservedRunningTime="2026-03-20 13:37:37.451688537 +0000 UTC m=+437.049621066" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.556091 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.556199 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:40 crc kubenswrapper[4755]: I0320 13:37:40.601997 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.518645 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-srzwn" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.621987 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:41 crc kubenswrapper[4755]: I0320 13:37:41.622055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.677821 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-c9s6q" podUID="f483e049-5032-496f-8608-494e07922763" containerName="registry-server" probeResult="failure" output=< Mar 20 13:37:42 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:37:42 crc kubenswrapper[4755]: > Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.981516 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:42 crc kubenswrapper[4755]: I0320 13:37:42.981592 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.052934 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.504838 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6g8x4" Mar 20 13:37:43 crc kubenswrapper[4755]: I0320 13:37:43.970210 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:43 crc 
kubenswrapper[4755]: I0320 13:37:43.970309 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:44 crc kubenswrapper[4755]: I0320 13:37:44.025100 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:44 crc kubenswrapper[4755]: I0320 13:37:44.532634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-nql9k" Mar 20 13:37:51 crc kubenswrapper[4755]: I0320 13:37:51.677393 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:51 crc kubenswrapper[4755]: I0320 13:37:51.737355 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-c9s6q" Mar 20 13:37:56 crc kubenswrapper[4755]: I0320 13:37:56.752679 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" containerID="cri-o://13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" gracePeriod=30 Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.178247 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314457 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314573 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314636 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314904 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314933 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314958 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.314996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.315048 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") pod \"408c6869-42d8-4cbc-a261-57fb45f0d666\" (UID: \"408c6869-42d8-4cbc-a261-57fb45f0d666\") " Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.315647 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.316918 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.325922 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm" (OuterVolumeSpecName: "kube-api-access-gpvlm") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "kube-api-access-gpvlm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.326749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.328842 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.329022 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.334764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.336907 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "408c6869-42d8-4cbc-a261-57fb45f0d666" (UID: "408c6869-42d8-4cbc-a261-57fb45f0d666"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416669 4755 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/408c6869-42d8-4cbc-a261-57fb45f0d666-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416708 4755 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/408c6869-42d8-4cbc-a261-57fb45f0d666-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416718 4755 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416735 4755 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416746 4755 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416754 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/408c6869-42d8-4cbc-a261-57fb45f0d666-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.416762 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpvlm\" (UniqueName: \"kubernetes.io/projected/408c6869-42d8-4cbc-a261-57fb45f0d666-kube-api-access-gpvlm\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552207 4755 generic.go:334] "Generic (PLEG): container finished" podID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" exitCode=0 Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerDied","Data":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"} Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552309 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552338 4755 scope.go:117] "RemoveContainer" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.552322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-bckdl" event={"ID":"408c6869-42d8-4cbc-a261-57fb45f0d666","Type":"ContainerDied","Data":"500c17d3b1ce4928afc1c5dda574edfd0b6a6075accfcdd8bfeb7e1c6f63f4a6"} Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.570586 4755 scope.go:117] "RemoveContainer" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: E0320 13:37:57.571043 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": container with ID starting with 13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3 not found: ID does not exist" containerID="13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.571071 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3"} err="failed to get container status \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": rpc error: code = NotFound desc = could not find container \"13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3\": container with ID starting with 13ca4b33da3f860eb2651fefe9f902fa37d24e30caad2a0f42b2467cc512f4a3 not found: ID does not exist" Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.587775 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:57 crc kubenswrapper[4755]: I0320 13:37:57.592145 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-bckdl"] Mar 20 13:37:59 crc kubenswrapper[4755]: I0320 13:37:59.231627 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" path="/var/lib/kubelet/pods/408c6869-42d8-4cbc-a261-57fb45f0d666/volumes" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133576 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: E0320 13:38:00.133848 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133861 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.133958 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="408c6869-42d8-4cbc-a261-57fb45f0d666" containerName="registry" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.134330 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.136283 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.136561 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.137315 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.149418 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.258207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.258784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.259066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.259546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.268771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.361239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.383374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"auto-csr-approver-29566898-pbw9z\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.463676 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.526974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:38:00 crc kubenswrapper[4755]: I0320 13:38:00.946718 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:38:00 crc kubenswrapper[4755]: W0320 13:38:00.953849 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb576c19_7f49_40ac_987b_5eefb5db31ce.slice/crio-0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25 WatchSource:0}: Error finding container 0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25: Status 404 returned error can't find the container with id 0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25 Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.378404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.378581 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.385818 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.386422 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.526185 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.526209 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.603466 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0a971fb54dae1dc6dd5a7cb86fdb33719d285d5997bdc789a750b5489dac589f"} Mar 20 13:38:01 crc kubenswrapper[4755]: I0320 13:38:01.606197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerStarted","Data":"0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25"} Mar 20 13:38:01 crc kubenswrapper[4755]: W0320 13:38:01.893986 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad WatchSource:0}: Error finding container ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad: Status 404 returned error can't find the container with id ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad Mar 20 13:38:02 crc kubenswrapper[4755]: W0320 13:38:02.088248 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980 WatchSource:0}: Error finding container 2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980: Status 404 returned error can't find the container with id 2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980 Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.613873 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"14c03d5bb6f71fb61256b58cc54b877069e5a78886068524dc350fc5cfb18820"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.615399 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fddd2f491ddff3a4b7735de2e6eac05470ce4139d55203831b9c334a4a28de32"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.615431 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"2484210464bdce6049c2354d3665486403b607e01f150fd4865cdce7909dd980"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.616144 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.617003 4755 generic.go:334] "Generic (PLEG): container finished" podID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerID="0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc" exitCode=0 Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.617129 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerDied","Data":"0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.618761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"aab4e784fd739fffb08708c0e260c074b42ae552f5d0712d1fd356ef51556faf"} Mar 20 13:38:02 crc kubenswrapper[4755]: I0320 13:38:02.618793 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ae39f02db5aaa4bb74ee00083f66e61b86175b9190b9c27d4eb41348b19a2dad"} Mar 20 13:38:03 crc kubenswrapper[4755]: I0320 13:38:03.880762 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.013205 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") pod \"cb576c19-7f49-40ac-987b-5eefb5db31ce\" (UID: \"cb576c19-7f49-40ac-987b-5eefb5db31ce\") " Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.019070 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf" (OuterVolumeSpecName: "kube-api-access-nrnnf") pod "cb576c19-7f49-40ac-987b-5eefb5db31ce" (UID: "cb576c19-7f49-40ac-987b-5eefb5db31ce"). InnerVolumeSpecName "kube-api-access-nrnnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.113999 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnnf\" (UniqueName: \"kubernetes.io/projected/cb576c19-7f49-40ac-987b-5eefb5db31ce-kube-api-access-nrnnf\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.640922 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" event={"ID":"cb576c19-7f49-40ac-987b-5eefb5db31ce","Type":"ContainerDied","Data":"0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25"} Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.640966 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ec995d63b4d50ed626fe3030cd6651d8b438c94c84af9d51ad66f2b27725e25" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.641351 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-pbw9z" Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.949797 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:38:04 crc kubenswrapper[4755]: I0320 13:38:04.953868 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xh9lg"] Mar 20 13:38:05 crc kubenswrapper[4755]: I0320 13:38:05.234832 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28deea0d-d80e-422b-a0c2-40670570aa68" path="/var/lib/kubelet/pods/28deea0d-d80e-422b-a0c2-40670570aa68/volumes" Mar 20 13:38:06 crc kubenswrapper[4755]: I0320 13:38:06.751445 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:38:06 crc kubenswrapper[4755]: I0320 13:38:06.751519 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:36 crc kubenswrapper[4755]: I0320 13:38:36.751971 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:36 crc kubenswrapper[4755]: I0320 13:38:36.753802 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:41 crc kubenswrapper[4755]: I0320 13:38:41.533188 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.756061 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.757068 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.757143 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.758017 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:39:06 crc kubenswrapper[4755]: I0320 13:39:06.758120 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab" gracePeriod=600 Mar 20 13:39:06 crc kubenswrapper[4755]: E0320 13:39:06.800486 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3eb406f6_1a26_4eea_84ac_e55f5232900c.slice/crio-d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099142 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab" exitCode=0 Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"} Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099758 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"} Mar 20 13:39:07 crc kubenswrapper[4755]: I0320 13:39:07.099802 4755 scope.go:117] "RemoveContainer" containerID="bcf41d6132d7c7c5c63919a40fcf22abd2adc9e2a7a57c2ffc6c11a609eca280" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.155019 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: E0320 13:40:00.156731 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.156766 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.156980 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" containerName="oc" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.158264 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.160931 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.161161 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.162014 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.162143 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.337321 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.439236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.472976 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"auto-csr-approver-29566900-nvf9d\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " 
pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.499308 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.773970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"] Mar 20 13:40:00 crc kubenswrapper[4755]: I0320 13:40:00.785639 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:40:01 crc kubenswrapper[4755]: I0320 13:40:01.525297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerStarted","Data":"14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783"} Mar 20 13:40:02 crc kubenswrapper[4755]: I0320 13:40:02.537728 4755 generic.go:334] "Generic (PLEG): container finished" podID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerID="5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6" exitCode=0 Mar 20 13:40:02 crc kubenswrapper[4755]: I0320 13:40:02.537884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerDied","Data":"5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6"} Mar 20 13:40:03 crc kubenswrapper[4755]: I0320 13:40:03.878899 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:03.997542 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") pod \"6af9d427-765f-4d25-9603-e0b39103e2cc\" (UID: \"6af9d427-765f-4d25-9603-e0b39103e2cc\") " Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.008689 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc" (OuterVolumeSpecName: "kube-api-access-jsvfc") pod "6af9d427-765f-4d25-9603-e0b39103e2cc" (UID: "6af9d427-765f-4d25-9603-e0b39103e2cc"). InnerVolumeSpecName "kube-api-access-jsvfc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.099694 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsvfc\" (UniqueName: \"kubernetes.io/projected/6af9d427-765f-4d25-9603-e0b39103e2cc-kube-api-access-jsvfc\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" event={"ID":"6af9d427-765f-4d25-9603-e0b39103e2cc","Type":"ContainerDied","Data":"14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783"} Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559823 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e4905417f84cf04080b971ec51719d500b6a85c41b7b0c397dc676f500f783" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.559895 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-nvf9d" Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.955080 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:40:04 crc kubenswrapper[4755]: I0320 13:40:04.959118 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-tzlc5"] Mar 20 13:40:05 crc kubenswrapper[4755]: I0320 13:40:05.241968 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34" path="/var/lib/kubelet/pods/d5d2d1af-dd24-4dd7-8f44-1e89d27e6d34/volumes" Mar 20 13:41:22 crc kubenswrapper[4755]: I0320 13:41:22.088835 4755 scope.go:117] "RemoveContainer" containerID="5b3f54c94a347a6034caa942913ea3e1af42b972d18f5799f2f913f58379470a" Mar 20 13:41:22 crc kubenswrapper[4755]: I0320 13:41:22.137185 4755 scope.go:117] "RemoveContainer" containerID="6e468078e481cdcc9bfa393977db88e8643d1ca19ffa94b078f20bdb71bbb6c9" Mar 20 13:41:36 crc kubenswrapper[4755]: I0320 13:41:36.751561 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:36 crc kubenswrapper[4755]: I0320 13:41:36.753813 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.153821 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:42:00 crc kubenswrapper[4755]: 
E0320 13:42:00.154975 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.154994 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.155123 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" containerName="oc" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.155645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.159868 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.161628 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.167432 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.199514 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.251166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6" Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.352856 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6"
Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.383807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"auto-csr-approver-29566902-zh8v6\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") " pod="openshift-infra/auto-csr-approver-29566902-zh8v6"
Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.518524 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6"
Mar 20 13:42:00 crc kubenswrapper[4755]: I0320 13:42:00.795844 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"]
Mar 20 13:42:01 crc kubenswrapper[4755]: I0320 13:42:01.492450 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerStarted","Data":"2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9"}
Mar 20 13:42:02 crc kubenswrapper[4755]: I0320 13:42:02.501010 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerStarted","Data":"dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac"}
Mar 20 13:42:02 crc kubenswrapper[4755]: I0320 13:42:02.523407 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" podStartSLOduration=1.191208469 podStartE2EDuration="2.523380372s" podCreationTimestamp="2026-03-20 13:42:00 +0000 UTC" firstStartedPulling="2026-03-20 13:42:00.811746822 +0000 UTC m=+700.409679351" lastFinishedPulling="2026-03-20 13:42:02.143918685 +0000 UTC m=+701.741851254" observedRunningTime="2026-03-20 13:42:02.51817238 +0000 UTC m=+702.116104919" watchObservedRunningTime="2026-03-20 13:42:02.523380372 +0000 UTC m=+702.121312911"
Mar 20 13:42:03 crc kubenswrapper[4755]: I0320 13:42:03.509574 4755 generic.go:334] "Generic (PLEG): container finished" podID="320783b7-7554-4157-b6cd-143d787dc30b" containerID="dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac" exitCode=0
Mar 20 13:42:03 crc kubenswrapper[4755]: I0320 13:42:03.509708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerDied","Data":"dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac"}
Mar 20 13:42:04 crc kubenswrapper[4755]: I0320 13:42:04.854396 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6"
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.025304 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") pod \"320783b7-7554-4157-b6cd-143d787dc30b\" (UID: \"320783b7-7554-4157-b6cd-143d787dc30b\") "
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.034818 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj" (OuterVolumeSpecName: "kube-api-access-ss9kj") pod "320783b7-7554-4157-b6cd-143d787dc30b" (UID: "320783b7-7554-4157-b6cd-143d787dc30b"). InnerVolumeSpecName "kube-api-access-ss9kj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.127428 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss9kj\" (UniqueName: \"kubernetes.io/projected/320783b7-7554-4157-b6cd-143d787dc30b-kube-api-access-ss9kj\") on node \"crc\" DevicePath \"\""
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.527531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-zh8v6" event={"ID":"320783b7-7554-4157-b6cd-143d787dc30b","Type":"ContainerDied","Data":"2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9"}
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.528140 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bbfbe341e14e25d280dea19851552932ed939e648b79b02aa8a08cab8af5eb9"
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.527645 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-zh8v6"
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.605582 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"]
Mar 20 13:42:05 crc kubenswrapper[4755]: I0320 13:42:05.614022 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-bp947"]
Mar 20 13:42:06 crc kubenswrapper[4755]: I0320 13:42:06.751288 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:42:06 crc kubenswrapper[4755]: I0320 13:42:06.751395 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:42:07 crc kubenswrapper[4755]: I0320 13:42:07.235749 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8532b92f-bed9-41b0-bf0d-99afa5703048" path="/var/lib/kubelet/pods/8532b92f-bed9-41b0-bf0d-99afa5703048/volumes"
Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.751484 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.752573 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.752714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s"
Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.753726 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:42:36 crc kubenswrapper[4755]: I0320 13:42:36.753846 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08" gracePeriod=600
Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.770606 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08" exitCode=0
Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.770692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"}
Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.771208 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"}
Mar 20 13:42:37 crc kubenswrapper[4755]: I0320 13:42:37.771234 4755 scope.go:117] "RemoveContainer" containerID="d343606a211eae61fa8567ec5f93ee97fb742d6b59f73d12ac2ce6a90c6bfaab"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.285435 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"]
Mar 20 13:42:51 crc kubenswrapper[4755]: E0320 13:42:51.286176 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286188 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286300 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="320783b7-7554-4157-b6cd-143d787dc30b" containerName="oc"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.286713 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289023 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289717 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-pc8h7"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.289956 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.304820 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.310710 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.311420 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7gpgn"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.313847 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-t6rjl"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.320386 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod \"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.320436 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.332034 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.337348 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.338321 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.342506 4755 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-pwksc"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.349137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod \"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.422123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.444170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4pk\" (UniqueName: \"kubernetes.io/projected/cdf5c938-39f0-46a4-bce6-1a0cf67624ab-kube-api-access-nb4pk\") pod \"cert-manager-858654f9db-7gpgn\" (UID: \"cdf5c938-39f0-46a4-bce6-1a0cf67624ab\") " pod="cert-manager/cert-manager-858654f9db-7gpgn"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.451289 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwzvj\" (UniqueName: \"kubernetes.io/projected/a3125fba-bed9-40d3-b53d-f976488e12d2-kube-api-access-jwzvj\") pod \"cert-manager-cainjector-cf98fcc89-l955j\" (UID: \"a3125fba-bed9-40d3-b53d-f976488e12d2\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.523850 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.543513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbkn5\" (UniqueName: \"kubernetes.io/projected/f3b802e1-c690-4817-91cf-d721cbfae51c-kube-api-access-wbkn5\") pod \"cert-manager-webhook-687f57d79b-hbz2p\" (UID: \"f3b802e1-c690-4817-91cf-d721cbfae51c\") " pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.610610 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.632896 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-7gpgn"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.660043 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.894213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-l955j"]
Mar 20 13:42:51 crc kubenswrapper[4755]: I0320 13:42:51.931277 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-7gpgn"]
Mar 20 13:42:51 crc kubenswrapper[4755]: W0320 13:42:51.932737 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdf5c938_39f0_46a4_bce6_1a0cf67624ab.slice/crio-86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea WatchSource:0}: Error finding container 86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea: Status 404 returned error can't find the container with id 86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea
Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.140730 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-hbz2p"]
Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.898468 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7gpgn" event={"ID":"cdf5c938-39f0-46a4-bce6-1a0cf67624ab","Type":"ContainerStarted","Data":"86920e411e7274954e0e81a04d99b938fb5ad6c6099513b1773d4dc1f3eee5ea"}
Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.900462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" event={"ID":"a3125fba-bed9-40d3-b53d-f976488e12d2","Type":"ContainerStarted","Data":"81ac5a23d30dd2baf4a9ed224ad9946bb30b65b3721cfd349352adbdc4615c64"}
Mar 20 13:42:52 crc kubenswrapper[4755]: I0320 13:42:52.902035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" event={"ID":"f3b802e1-c690-4817-91cf-d721cbfae51c","Type":"ContainerStarted","Data":"28520456b43b76327fa3d35665c454163acb8ca94150f2a252c25d13e93b0e8b"}
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.923574 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-7gpgn" event={"ID":"cdf5c938-39f0-46a4-bce6-1a0cf67624ab","Type":"ContainerStarted","Data":"204e46f0d6136ec475dd6ae41242dbe77978089d71f7a352dbbd7f27b9df0ef3"}
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.925716 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" event={"ID":"a3125fba-bed9-40d3-b53d-f976488e12d2","Type":"ContainerStarted","Data":"1c07257661096886d39e7b410e72c58fd60c4e93df1eab568b720e3c54f848cb"}
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.927508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" event={"ID":"f3b802e1-c690-4817-91cf-d721cbfae51c","Type":"ContainerStarted","Data":"e238cdd2c82235f75eb7c47597f59cc4ca556ee48295f424497a94ef17e9b326"}
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.927736 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p"
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.969562 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-7gpgn" podStartSLOduration=1.39869907 podStartE2EDuration="4.969519952s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:51.934255129 +0000 UTC m=+751.532187658" lastFinishedPulling="2026-03-20 13:42:55.505076011 +0000 UTC m=+755.103008540" observedRunningTime="2026-03-20 13:42:55.944967822 +0000 UTC m=+755.542900381" watchObservedRunningTime="2026-03-20 13:42:55.969519952 +0000 UTC m=+755.567452521"
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.972796 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-l955j" podStartSLOduration=1.375408532 podStartE2EDuration="4.972773421s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:51.907912559 +0000 UTC m=+751.505845088" lastFinishedPulling="2026-03-20 13:42:55.505277408 +0000 UTC m=+755.103209977" observedRunningTime="2026-03-20 13:42:55.966820218 +0000 UTC m=+755.564752757" watchObservedRunningTime="2026-03-20 13:42:55.972773421 +0000 UTC m=+755.570706000"
Mar 20 13:42:55 crc kubenswrapper[4755]: I0320 13:42:55.983493 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" podStartSLOduration=1.563895821 podStartE2EDuration="4.983463123s" podCreationTimestamp="2026-03-20 13:42:51 +0000 UTC" firstStartedPulling="2026-03-20 13:42:52.149888277 +0000 UTC m=+751.747820806" lastFinishedPulling="2026-03-20 13:42:55.569455539 +0000 UTC m=+755.167388108" observedRunningTime="2026-03-20 13:42:55.981098358 +0000 UTC m=+755.579030897" watchObservedRunningTime="2026-03-20 13:42:55.983463123 +0000 UTC m=+755.581395652"
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.984254 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"]
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985481 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller" containerID="cri-o://aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985531 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb" containerID="cri-o://985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985601 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985809 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd" containerID="cri-o://f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985887 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node" containerID="cri-o://e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.985979 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging" containerID="cri-o://5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" gracePeriod=30
Mar 20 13:43:00 crc kubenswrapper[4755]: I0320 13:43:00.986061 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb" containerID="cri-o://4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" gracePeriod=30
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.053175 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller" containerID="cri-o://e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" gracePeriod=30
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.347117 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.349870 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-acl-logging/0.log"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.350463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-controller/0.log"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.351064 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.433071 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-dgqtf"]
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.434061 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.434269 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.434430 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.434589 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.435227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435400 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.435598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.435818 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.436176 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.436348 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.436887 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.436976 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437056 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437156 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437227 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437295 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kubecfg-setup"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437357 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kubecfg-setup"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437424 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437713 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.437793 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.437888 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438140 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438229 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438304 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="northd"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438374 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="sbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438447 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-ovn-metrics"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438514 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="kube-rbac-proxy-node"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438577 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438642 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-acl-logging"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438748 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="nbdb"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.438822 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovn-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.439117 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439207 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439408 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.439796 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerName="ovnkube-controller"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.441960 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf"
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473804 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473869 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.473968 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474114 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474520 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474515 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474598 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") "
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474628 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-cni-bin".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474646 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474687 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash" (OuterVolumeSpecName: "host-slash") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474702 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474755 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474774 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474809 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474829 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474844 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") pod \"e0de398a-6f32-4b1c-a840-10ff45da7251\" (UID: \"e0de398a-6f32-4b1c-a840-10ff45da7251\") " Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474908 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474932 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474952 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474972 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.474993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475026 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475104 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475107 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475126 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475182 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: 
\"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475222 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475257 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475281 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475295 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475356 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475367 4755 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475375 4755 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475384 4755 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475393 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475402 4755 
reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475410 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475165 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475188 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475211 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475439 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475564 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475585 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log" (OuterVolumeSpecName: "node-log") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.475610 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket" (OuterVolumeSpecName: "log-socket") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.477383 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.479598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.486519 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.487618 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b" (OuterVolumeSpecName: "kube-api-access-jlq8b") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "kube-api-access-jlq8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.498168 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "e0de398a-6f32-4b1c-a840-10ff45da7251" (UID: "e0de398a-6f32-4b1c-a840-10ff45da7251"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod 
\"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576309 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-systemd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-node-log\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" 
Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576473 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576685 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc 
kubenswrapper[4755]: I0320 13:43:01.576803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576884 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.576952 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577037 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-config\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577189 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-log-socket\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577352 4755 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577374 4755 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577391 4755 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577409 4755 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577428 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlq8b\" (UniqueName: \"kubernetes.io/projected/e0de398a-6f32-4b1c-a840-10ff45da7251-kube-api-access-jlq8b\") on node \"crc\" DevicePath \"\"" Mar 20 
13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577444 4755 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577462 4755 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577479 4755 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577497 4755 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577513 4755 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0de398a-6f32-4b1c-a840-10ff45da7251-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577532 4755 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577551 4755 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0de398a-6f32-4b1c-a840-10ff45da7251-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577567 4755 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/e0de398a-6f32-4b1c-a840-10ff45da7251-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-run-ovn\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577620 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-env-overrides\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577682 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-systemd-units\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577723 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-var-lib-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-netd\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 
20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577763 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577806 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-ovn-kubernetes\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577807 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-etc-openvswitch\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577850 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-kubelet\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.577853 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-slash\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578024 
4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-cni-bin\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578061 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9e7d5628-1936-4039-86ee-97de2cf80ad6-host-run-netns\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.578452 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovnkube-script-lib\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.584716 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9e7d5628-1936-4039-86ee-97de2cf80ad6-ovn-node-metrics-cert\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.601633 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsr4k\" (UniqueName: \"kubernetes.io/projected/9e7d5628-1936-4039-86ee-97de2cf80ad6-kube-api-access-tsr4k\") pod \"ovnkube-node-dgqtf\" (UID: \"9e7d5628-1936-4039-86ee-97de2cf80ad6\") " pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.665998 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="cert-manager/cert-manager-webhook-687f57d79b-hbz2p" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.760464 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990463 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7d5628-1936-4039-86ee-97de2cf80ad6" containerID="8093898e5298d04a4cfbe84857e7a5d3b869d75ee6e76431d6f3187ef1e83f01" exitCode=0 Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990555 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerDied","Data":"8093898e5298d04a4cfbe84857e7a5d3b869d75ee6e76431d6f3187ef1e83f01"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.990619 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"5d41d05dc730594ae7f0ab6ae8dc2d4fe89e05bfb580d459973d1cb0d08179c7"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.993950 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996642 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/1.log" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996743 4755 generic.go:334] "Generic (PLEG): container finished" podID="e5ba4f17-8c41-4124-b563-01d5f1751139" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" exitCode=2 Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996856 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" 
event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerDied","Data":"1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552"} Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.996960 4755 scope.go:117] "RemoveContainer" containerID="cc57f57c90501469513cee3bca2bc8c58f06f1d5e1416cb7fb35a72ef38950fc" Mar 20 13:43:01 crc kubenswrapper[4755]: I0320 13:43:01.998318 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:01 crc kubenswrapper[4755]: E0320 13:43:01.998766 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.002832 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovnkube-controller/3.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.010874 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-acl-logging/0.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.011595 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bd25w_e0de398a-6f32-4b1c-a840-10ff45da7251/ovn-controller/0.log" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013578 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013616 4755 generic.go:334] "Generic (PLEG): container finished" 
podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013625 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013636 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013631 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013648 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013710 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" exitCode=0 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013724 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" exitCode=143 Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013735 4755 generic.go:334] "Generic (PLEG): container finished" podID="e0de398a-6f32-4b1c-a840-10ff45da7251" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" exitCode=143 Mar 20 
13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013754 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013772 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013787 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013801 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013811 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013824 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013838 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013844 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013854 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013860 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013865 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013871 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013876 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013881 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013886 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013893 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013901 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013910 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013915 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013920 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013926 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013931 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013831 4755 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.013937 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014064 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014081 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014089 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014117 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014144 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014153 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014161 4755 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014168 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014176 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014184 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014191 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014199 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014206 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014215 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014225 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bd25w" event={"ID":"e0de398a-6f32-4b1c-a840-10ff45da7251","Type":"ContainerDied","Data":"c5dfe5e0ba9e4e073084c039346a869cdace2560fac63c02b23de7cad0ed5e4a"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014237 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014247 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014254 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014264 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014271 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014278 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014286 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} Mar 20 
13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014292 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014299 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.014306 4755 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.050582 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.068874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.075270 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bd25w"] Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.079195 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.105626 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.123315 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.180309 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23" Mar 20 13:43:02 crc 
kubenswrapper[4755]: I0320 13:43:02.209206 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.251073 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.272905 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.290348 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.304978 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.324198 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.325083 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054" Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.325232 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with 
ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.325318 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.326185 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326262 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326334 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.326918 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.326992 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327042 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.327780 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327829 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.327858 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.328271 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328353 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328496 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.328941 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.328994 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329028 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.329503 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329575 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.329600 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.330311 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.330349 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.330374 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.331296 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331336 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331364 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"
Mar 20 13:43:02 crc kubenswrapper[4755]: E0320 13:43:02.331732 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331810 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.331915 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.332729 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.332769 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333288 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333344 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333771 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.333807 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334285 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334913 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.334938 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.335788 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.335825 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336338 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336366 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336810 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.336840 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.337400 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.337423 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338346 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338388 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338883 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.338911 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.339225 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.339268 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340070 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340096 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340511 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.340623 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341277 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341319 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341770 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.341801 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342278 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342322 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.342935 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.343008 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344116 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344148 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344490 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.344519 4755 scope.go:117] "RemoveContainer" containerID="e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.345466 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054"} err="failed to get container status \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": rpc error: code = NotFound desc = could not find container \"e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054\": container with ID starting with e3a521948aaef8df8a240c81ba9fbed18c0917d8ed6bf0e2f17e5b27bde7e054 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.345505 4755 scope.go:117] "RemoveContainer" containerID="08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346025 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927"} err="failed to get container status \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": rpc error: code = NotFound desc = could not find container \"08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927\": container with ID starting with 08ff7017c134f8a6c9dc850db0a200068381e96a97bd6c1c0aa7584210eec927 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346053 4755 scope.go:117] "RemoveContainer" containerID="4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346708 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c"} err="failed to get container status \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": rpc error: code = NotFound desc = could not find container \"4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c\": container with ID starting with 4eef234b00eb2c85e30b1ddf9b13a1701d65d6e97f5c13944a6a817b8171108c not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.346738 4755 scope.go:117] "RemoveContainer" containerID="985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347178 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4"} err="failed to get container status \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": rpc error: code = NotFound desc = could not find container \"985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4\": container with ID starting with 985d918507a47bee448ca706de6b77102747aac58f3f997b459fd8c0a5cc46c4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347231 4755 scope.go:117] "RemoveContainer" containerID="f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347501 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23"} err="failed to get container status \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": rpc error: code = NotFound desc = could not find container \"f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23\": container with ID starting with f9899b533cf66b64f04534621c9b1fc380f13539e49cfc8d009e9debddcb7e23 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347526 4755 scope.go:117] "RemoveContainer" containerID="02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347802 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416"} err="failed to get container status \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": rpc error: code = NotFound desc = could not find container \"02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416\": container with ID starting with 02cbaf9879fdfaf46840e6e5ef044c30cf4e66262f264c241f468ef336c7c416 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.347827 4755 scope.go:117] "RemoveContainer" containerID="e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348069 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1"} err="failed to get container status \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": rpc error: code = NotFound desc = could not find container \"e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1\": container with ID starting with e489e88d172337cde70987f7b4e7621debb94ee79c6785df586a511cf221b1f1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348092 4755 scope.go:117] "RemoveContainer" containerID="5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348348 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1"} err="failed to get container status \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": rpc error: code = NotFound desc = could not find container \"5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1\": container with ID starting with 5ff083eb3d6f6e0fc232e86eebdf22232e947d0be2e45c916b38f83136eb3ca1 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348375 4755 scope.go:117] "RemoveContainer" containerID="aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348848 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4"} err="failed to get container status \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": rpc error: code = NotFound desc = could not find container \"aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4\": container with ID starting with aa820e0be38b66899628ff95eeec9f85522d7fb0b78dfdcb80d2c47dc1223ca4 not found: ID does not exist"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.348872 4755 scope.go:117] "RemoveContainer" containerID="d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"
Mar 20 13:43:02 crc kubenswrapper[4755]: I0320 13:43:02.349222 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68"} err="failed to get container status \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": rpc error: code = NotFound desc = could not find container \"d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68\": container with ID starting with d5c50c59aafc4f27a01fd405ebb1e8b974816b330608d68af4c1beb317fb3d68 not found: ID does not exist"
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"eaa68f4520836a4aa1778f4602112733625a44814fbca559bce5631d00337cf0"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021570 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"4d265715dc08d2eed66fade7a098da657c8066df17249cd01550a7f350e1bcde"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021590 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"b35b9e9adbb3c6a19448f21171b09caca902d53d0ec07f326d04aa50976acf43"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"ff39e035be42fdc2b822e029f26c27a557fa07db1d50f323d4994c26e4437030"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"dcafc946601b25fdb92ca9369d3269fbfca344a83bbaac7be5d0e25dd62d1cf0"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.021638 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"13304c8e375660819ed03ac2a12bd87812f675185fb96acf6676a6efc651baa0"}
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.022735 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log"
Mar 20 13:43:03 crc kubenswrapper[4755]: I0320 13:43:03.239220 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0de398a-6f32-4b1c-a840-10ff45da7251" path="/var/lib/kubelet/pods/e0de398a-6f32-4b1c-a840-10ff45da7251/volumes"
Mar 20 13:43:06 crc kubenswrapper[4755]: I0320 13:43:06.056190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"b7bac5a98f64044d181da2907d2af7f15127a4bd13e1dec50169e6fa1db2f2fd"}
Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.091231 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" event={"ID":"9e7d5628-1936-4039-86ee-97de2cf80ad6","Type":"ContainerStarted","Data":"a1627074f67d849a42402a3cc17a04efa9127c57322c6c903d4f2e418496bdfb"}
Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092020 4755 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092035 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.092044 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.123581 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" podStartSLOduration=7.123559901 podStartE2EDuration="7.123559901s" podCreationTimestamp="2026-03-20 13:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:08.11804735 +0000 UTC m=+767.715979899" watchObservedRunningTime="2026-03-20 13:43:08.123559901 +0000 UTC m=+767.721492430" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.126277 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:08 crc kubenswrapper[4755]: I0320 13:43:08.127547 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:15 crc kubenswrapper[4755]: I0320 13:43:15.226062 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:15 crc kubenswrapper[4755]: E0320 13:43:15.227077 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-8btvn_openshift-multus(e5ba4f17-8c41-4124-b563-01d5f1751139)\"" pod="openshift-multus/multus-8btvn" podUID="e5ba4f17-8c41-4124-b563-01d5f1751139" Mar 20 13:43:22 crc 
kubenswrapper[4755]: I0320 13:43:22.238697 4755 scope.go:117] "RemoveContainer" containerID="48fedef7d2253c830a250936f751690b6a7ff3c3f6839674f960627f11642a63" Mar 20 13:43:26 crc kubenswrapper[4755]: I0320 13:43:26.225470 4755 scope.go:117] "RemoveContainer" containerID="1a5be238c9b55e38ba503f22f1f6413892abacc16e48e85c4392f869da964552" Mar 20 13:43:27 crc kubenswrapper[4755]: I0320 13:43:27.237049 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-8btvn_e5ba4f17-8c41-4124-b563-01d5f1751139/kube-multus/2.log" Mar 20 13:43:27 crc kubenswrapper[4755]: I0320 13:43:27.237737 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8btvn" event={"ID":"e5ba4f17-8c41-4124-b563-01d5f1751139","Type":"ContainerStarted","Data":"f2b63f719c6f77c2b644c8e74f4be0a1dd2a972d78a6d1db6619be3ae9203011"} Mar 20 13:43:31 crc kubenswrapper[4755]: I0320 13:43:31.790699 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-dgqtf" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.568600 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.571125 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.575007 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.585621 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715254 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.715309 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: 
I0320 13:43:41.817192 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.817298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.817342 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.818029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.818122 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.845511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:41 crc kubenswrapper[4755]: I0320 13:43:41.955096 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.212972 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv"] Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.351428 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerStarted","Data":"9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21"} Mar 20 13:43:42 crc kubenswrapper[4755]: I0320 13:43:42.351506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerStarted","Data":"71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a"} Mar 20 13:43:43 crc kubenswrapper[4755]: I0320 13:43:43.358035 4755 
generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21" exitCode=0 Mar 20 13:43:43 crc kubenswrapper[4755]: I0320 13:43:43.358094 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"9ee672984c4ceb2bcc2f1396988317448c7afc99ee3ca0d33a55ca0362030e21"} Mar 20 13:43:46 crc kubenswrapper[4755]: I0320 13:43:46.381132 4755 generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="607b709b941a51438e8f0f347993c872dd32eb89b9b069b9bbe07168c2adce9a" exitCode=0 Mar 20 13:43:46 crc kubenswrapper[4755]: I0320 13:43:46.381200 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"607b709b941a51438e8f0f347993c872dd32eb89b9b069b9bbe07168c2adce9a"} Mar 20 13:43:47 crc kubenswrapper[4755]: I0320 13:43:47.394036 4755 generic.go:334] "Generic (PLEG): container finished" podID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerID="f51e02d9a7299d98f804d8510afd259a39e50bf730e4bab151a53197c6e45525" exitCode=0 Mar 20 13:43:47 crc kubenswrapper[4755]: I0320 13:43:47.394158 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"f51e02d9a7299d98f804d8510afd259a39e50bf730e4bab151a53197c6e45525"} Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.713427 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.815815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.816078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.816274 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") pod \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\" (UID: \"73d92c92-af26-4aa9-a774-04a1ef37b3c7\") " Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.817591 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle" (OuterVolumeSpecName: "bundle") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.824838 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm" (OuterVolumeSpecName: "kube-api-access-bxljm") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "kube-api-access-bxljm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.829444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util" (OuterVolumeSpecName: "util") pod "73d92c92-af26-4aa9-a774-04a1ef37b3c7" (UID: "73d92c92-af26-4aa9-a774-04a1ef37b3c7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918688 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918751 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/73d92c92-af26-4aa9-a774-04a1ef37b3c7-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:48 crc kubenswrapper[4755]: I0320 13:43:48.918773 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxljm\" (UniqueName: \"kubernetes.io/projected/73d92c92-af26-4aa9-a774-04a1ef37b3c7-kube-api-access-bxljm\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" event={"ID":"73d92c92-af26-4aa9-a774-04a1ef37b3c7","Type":"ContainerDied","Data":"71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a"} Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412065 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv" Mar 20 13:43:49 crc kubenswrapper[4755]: I0320 13:43:49.412093 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71f0d4635a5d7118a7fe912308b02cb08940fabfc621ac2d49f0b239e72fc58a" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.223943 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224646 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224677 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224704 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="pull" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224710 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="pull" Mar 20 13:43:51 crc kubenswrapper[4755]: E0320 13:43:51.224720 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="util" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224726 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="util" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.224838 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="73d92c92-af26-4aa9-a774-04a1ef37b3c7" containerName="extract" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.225440 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.233033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-g9tlc" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.235860 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.236685 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.238411 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.357233 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: \"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.458627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: \"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.487026 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7lkx\" (UniqueName: \"kubernetes.io/projected/93adf7be-d696-48e2-b6d5-af27b19b24e3-kube-api-access-d7lkx\") pod \"nmstate-operator-796d4cfff4-rz567\" (UID: 
\"93adf7be-d696-48e2-b6d5-af27b19b24e3\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.546742 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" Mar 20 13:43:51 crc kubenswrapper[4755]: I0320 13:43:51.868988 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-rz567"] Mar 20 13:43:52 crc kubenswrapper[4755]: I0320 13:43:52.436194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" event={"ID":"93adf7be-d696-48e2-b6d5-af27b19b24e3","Type":"ContainerStarted","Data":"acaa4643d8798df54cd0b351bd4120eb64693e4fe05b362ad5b47b4cbf3793d2"} Mar 20 13:43:54 crc kubenswrapper[4755]: I0320 13:43:54.449442 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" event={"ID":"93adf7be-d696-48e2-b6d5-af27b19b24e3","Type":"ContainerStarted","Data":"c9d720dc57f1ab407eb947def07c09c11b539e171ec406154f73dcaf7d0ffe53"} Mar 20 13:43:54 crc kubenswrapper[4755]: I0320 13:43:54.476960 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-rz567" podStartSLOduration=1.172308329 podStartE2EDuration="3.476933319s" podCreationTimestamp="2026-03-20 13:43:51 +0000 UTC" firstStartedPulling="2026-03-20 13:43:51.881813893 +0000 UTC m=+811.479746422" lastFinishedPulling="2026-03-20 13:43:54.186438883 +0000 UTC m=+813.784371412" observedRunningTime="2026-03-20 13:43:54.473893026 +0000 UTC m=+814.071825565" watchObservedRunningTime="2026-03-20 13:43:54.476933319 +0000 UTC m=+814.074865878" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.145425 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 
13:44:00.146970 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.149893 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.150295 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.150728 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.160450 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.185572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.287530 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.321633 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod 
\"auto-csr-approver-29566904-2crfj\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.505196 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:00 crc kubenswrapper[4755]: I0320 13:44:00.941289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.296293 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.300936 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.306460 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-d78x9" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.315137 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.336710 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.337875 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.341754 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.350015 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.364909 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-dspfd"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.365967 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404130 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " 
pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404236 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404279 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404299 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.404323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.442067 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.443027 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.447047 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.447364 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.448332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-vwnxs" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.456802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.502366 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerStarted","Data":"8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee"} Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505168 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505193 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505234 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505256 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505274 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505316 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505345 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505378 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505407 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505359 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-nmstate-lock\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.505455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-ovs-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 
13:44:01.505636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-dbus-socket\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.521453 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.521469 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz6bw\" (UniqueName: \"kubernetes.io/projected/2e4b8ce9-115c-4c39-9f1b-a5681ded9b68-kube-api-access-qz6bw\") pod \"nmstate-handler-dspfd\" (UID: \"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68\") " pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.522527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmhf7\" (UniqueName: \"kubernetes.io/projected/36f8cd57-a5ee-4a30-b7b6-8f13d698861c-kube-api-access-xmhf7\") pod \"nmstate-webhook-5f558f5558-72787\" (UID: \"36f8cd57-a5ee-4a30-b7b6-8f13d698861c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.531013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clht\" (UniqueName: \"kubernetes.io/projected/284e4beb-7815-41fc-ac59-95ed647c0d7c-kube-api-access-5clht\") pod \"nmstate-metrics-9b8c8685d-68g6g\" (UID: \"284e4beb-7815-41fc-ac59-95ed647c0d7c\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606180 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606565 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.606584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.607823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a9993046-1fc7-4faa-a634-f91339d94c71-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.611215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9993046-1fc7-4faa-a634-f91339d94c71-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.627756 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnrqq\" (UniqueName: \"kubernetes.io/projected/a9993046-1fc7-4faa-a634-f91339d94c71-kube-api-access-hnrqq\") pod \"nmstate-console-plugin-86f58fcf4-2hkvk\" (UID: \"a9993046-1fc7-4faa-a634-f91339d94c71\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.647118 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.663555 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.663590 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.664686 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.675775 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.679335 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708828 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708860 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.708951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.710217 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.766297 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.811712 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.811996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812016 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812056 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: 
\"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812076 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.812098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.813483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-service-ca\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.814608 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-trusted-ca-bundle\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.815170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " 
pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.815418 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-oauth-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.821129 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-oauth-config\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.822382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-console-serving-cert\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.834804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pq4v\" (UniqueName: \"kubernetes.io/projected/9fdb87ca-2790-4a05-8438-d3b5ae3b78da-kube-api-access-6pq4v\") pod \"console-6d8f88c4cc-78flh\" (UID: \"9fdb87ca-2790-4a05-8438-d3b5ae3b78da\") " pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.948401 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-72787"] Mar 20 13:44:01 crc kubenswrapper[4755]: I0320 13:44:01.986599 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g"] Mar 20 13:44:01 crc 
kubenswrapper[4755]: I0320 13:44:01.990508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d8f88c4cc-78flh" Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.073070 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk"] Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.201175 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d8f88c4cc-78flh"] Mar 20 13:44:02 crc kubenswrapper[4755]: W0320 13:44:02.209095 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fdb87ca_2790_4a05_8438_d3b5ae3b78da.slice/crio-a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82 WatchSource:0}: Error finding container a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82: Status 404 returned error can't find the container with id a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82 Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.509613 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8f88c4cc-78flh" event={"ID":"9fdb87ca-2790-4a05-8438-d3b5ae3b78da","Type":"ContainerStarted","Data":"b0ba923a011fb330fc1f9b322df9a726e62a33c31eb8cb98c80655a2e0a6fb99"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.509719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d8f88c4cc-78flh" event={"ID":"9fdb87ca-2790-4a05-8438-d3b5ae3b78da","Type":"ContainerStarted","Data":"a378dbfdf6d069df9f6e608e3558f5ec62b3e9a7f7ec76deae8d97ac0f5a8e82"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.510896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dspfd" 
event={"ID":"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68","Type":"ContainerStarted","Data":"53eb56592f8c80638a844cf0f7c7e67f21a4aadcafcd03edcc772ed58ef83739"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.512735 4755 generic.go:334] "Generic (PLEG): container finished" podID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerID="9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43" exitCode=0 Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.512835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerDied","Data":"9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.513617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"bef119c37e5ad93b35f2a2adc96dd82a2608d0fe70e2a5eeeb98ec9d894fc009"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.514247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" event={"ID":"a9993046-1fc7-4faa-a634-f91339d94c71","Type":"ContainerStarted","Data":"6bef5544c50e2b2dd9a459e7c413b33879930338956139addf3cddb53d6f1617"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.515532 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" event={"ID":"36f8cd57-a5ee-4a30-b7b6-8f13d698861c","Type":"ContainerStarted","Data":"91116addc9df5afeb945720384912e548f8a1317630820f06de6e469d3480d66"} Mar 20 13:44:02 crc kubenswrapper[4755]: I0320 13:44:02.532166 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d8f88c4cc-78flh" podStartSLOduration=1.532146005 podStartE2EDuration="1.532146005s" 
podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:02.532027442 +0000 UTC m=+822.129959981" watchObservedRunningTime="2026-03-20 13:44:02.532146005 +0000 UTC m=+822.130078534" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.759211 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.839855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") pod \"719824b6-7bd2-41dc-a61f-039b161a94d6\" (UID: \"719824b6-7bd2-41dc-a61f-039b161a94d6\") " Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.864609 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb" (OuterVolumeSpecName: "kube-api-access-6s7wb") pod "719824b6-7bd2-41dc-a61f-039b161a94d6" (UID: "719824b6-7bd2-41dc-a61f-039b161a94d6"). InnerVolumeSpecName "kube-api-access-6s7wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:03 crc kubenswrapper[4755]: I0320 13:44:03.941905 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s7wb\" (UniqueName: \"kubernetes.io/projected/719824b6-7bd2-41dc-a61f-039b161a94d6-kube-api-access-6s7wb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529218 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-2crfj" event={"ID":"719824b6-7bd2-41dc-a61f-039b161a94d6","Type":"ContainerDied","Data":"8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee"} Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529260 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8be9a05a5630d65833439e7bb6c8bfa3ea50a771dc4b26d3583959dcd613b0ee" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.529271 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-2crfj" Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.819946 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:44:04 crc kubenswrapper[4755]: I0320 13:44:04.825378 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-pbw9z"] Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.236262 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb576c19-7f49-40ac-987b-5eefb5db31ce" path="/var/lib/kubelet/pods/cb576c19-7f49-40ac-987b-5eefb5db31ce/volumes" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.538278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" 
event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"add667afab7a32c46e2d27f11ae0d50f887e086ed1ee76fe95f59029e791f567"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.540311 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" event={"ID":"a9993046-1fc7-4faa-a634-f91339d94c71","Type":"ContainerStarted","Data":"f950d904399f59a00448fd1b2d442e0f328b7ab728925a2ecbcff05f89c11cef"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.544182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" event={"ID":"36f8cd57-a5ee-4a30-b7b6-8f13d698861c","Type":"ContainerStarted","Data":"b8777256907f0e08b2fab5947add70027e568375d5e358a7f1fd87e1d67c2b58"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.544301 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.547877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-dspfd" event={"ID":"2e4b8ce9-115c-4c39-9f1b-a5681ded9b68","Type":"ContainerStarted","Data":"83cbde2e0e11c48350d4dc1a0f1d0dfac6682f6b3b81fac89ef0efa3c0623a5d"} Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.548341 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-dspfd" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.561871 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-2hkvk" podStartSLOduration=1.725631763 podStartE2EDuration="4.561848655s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:02.08208399 +0000 UTC m=+821.680016519" lastFinishedPulling="2026-03-20 13:44:04.918300882 +0000 UTC m=+824.516233411" 
observedRunningTime="2026-03-20 13:44:05.559010588 +0000 UTC m=+825.156943127" watchObservedRunningTime="2026-03-20 13:44:05.561848655 +0000 UTC m=+825.159781194" Mar 20 13:44:05 crc kubenswrapper[4755]: I0320 13:44:05.589137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-dspfd" podStartSLOduration=1.4071196160000001 podStartE2EDuration="4.589116187s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:01.73262766 +0000 UTC m=+821.330560189" lastFinishedPulling="2026-03-20 13:44:04.914624231 +0000 UTC m=+824.512556760" observedRunningTime="2026-03-20 13:44:05.588418748 +0000 UTC m=+825.186351277" watchObservedRunningTime="2026-03-20 13:44:05.589116187 +0000 UTC m=+825.187048716" Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.585829 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" event={"ID":"284e4beb-7815-41fc-ac59-95ed647c0d7c","Type":"ContainerStarted","Data":"79a786ab22b7080c5290cefa0648f56126000f30c57dd2b8c2a8ffbe8488840d"} Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.615195 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-68g6g" podStartSLOduration=1.6604807240000001 podStartE2EDuration="8.615156862s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:02.002816726 +0000 UTC m=+821.600749255" lastFinishedPulling="2026-03-20 13:44:08.957492864 +0000 UTC m=+828.555425393" observedRunningTime="2026-03-20 13:44:09.61101175 +0000 UTC m=+829.208944349" watchObservedRunningTime="2026-03-20 13:44:09.615156862 +0000 UTC m=+829.213089441" Mar 20 13:44:09 crc kubenswrapper[4755]: I0320 13:44:09.616121 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787" podStartSLOduration=5.65377989 
podStartE2EDuration="8.616104688s" podCreationTimestamp="2026-03-20 13:44:01 +0000 UTC" firstStartedPulling="2026-03-20 13:44:01.966771326 +0000 UTC m=+821.564703865" lastFinishedPulling="2026-03-20 13:44:04.929096134 +0000 UTC m=+824.527028663" observedRunningTime="2026-03-20 13:44:05.616953944 +0000 UTC m=+825.214886483" watchObservedRunningTime="2026-03-20 13:44:09.616104688 +0000 UTC m=+829.214037257"
Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.716910 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-dspfd"
Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.991796 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d8f88c4cc-78flh"
Mar 20 13:44:11 crc kubenswrapper[4755]: I0320 13:44:11.991902 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d8f88c4cc-78flh"
Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.000220 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6d8f88c4cc-78flh"
Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.611637 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6d8f88c4cc-78flh"
Mar 20 13:44:12 crc kubenswrapper[4755]: I0320 13:44:12.681499 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"]
Mar 20 13:44:21 crc kubenswrapper[4755]: I0320 13:44:21.671705 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-72787"
Mar 20 13:44:22 crc kubenswrapper[4755]: I0320 13:44:22.340928 4755 scope.go:117] "RemoveContainer" containerID="0475ddf40f1f946ce60b2db15f62182a6200adfb2c95b60479c44432cfa187cc"
Mar 20 13:44:29 crc kubenswrapper[4755]: I0320 13:44:29.238442 4755 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.596092 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"]
Mar 20 13:44:37 crc kubenswrapper[4755]: E0320 13:44:37.597292 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.597311 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.597468 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" containerName="oc"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.598682 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.603157 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.610182 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"]
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.753996 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-rb5zn" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console" containerID="cri-o://664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" gracePeriod=15
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770038 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.770151 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.898997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.899130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.899245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.900857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.901253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:37 crc kubenswrapper[4755]: I0320 13:44:37.937606 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.232633 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.242823 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb5zn_27405a42-41b4-4521-93f3-41d029fab255/console/0.log"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.242905 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313775 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313849 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.313871 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314022 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314130 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.314167 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") pod \"27405a42-41b4-4521-93f3-41d029fab255\" (UID: \"27405a42-41b4-4521-93f3-41d029fab255\") "
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.315749 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca" (OuterVolumeSpecName: "service-ca") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.316114 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config" (OuterVolumeSpecName: "console-config") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.319366 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.320457 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.321006 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx" (OuterVolumeSpecName: "kube-api-access-b95lx") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "kube-api-access-b95lx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.327368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.328057 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "27405a42-41b4-4521-93f3-41d029fab255" (UID: "27405a42-41b4-4521-93f3-41d029fab255"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415887 4755 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415947 4755 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415962 4755 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/27405a42-41b4-4521-93f3-41d029fab255-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415971 4755 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.415998 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b95lx\" (UniqueName: \"kubernetes.io/projected/27405a42-41b4-4521-93f3-41d029fab255-kube-api-access-b95lx\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.416007 4755 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.416016 4755 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27405a42-41b4-4521-93f3-41d029fab255-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.493180 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"]
Mar 20 13:44:38 crc kubenswrapper[4755]: W0320 13:44:38.495151 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e8346da_4c59_4f8f_9804_02ad176bc15d.slice/crio-3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40 WatchSource:0}: Error finding container 3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40: Status 404 returned error can't find the container with id 3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845042 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-rb5zn_27405a42-41b4-4521-93f3-41d029fab255/console/0.log"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845119 4755 generic.go:334] "Generic (PLEG): container finished" podID="27405a42-41b4-4521-93f3-41d029fab255" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea" exitCode=2
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845221 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerDied","Data":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"}
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845234 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rb5zn"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rb5zn" event={"ID":"27405a42-41b4-4521-93f3-41d029fab255","Type":"ContainerDied","Data":"a883711492469aba5080025f39ee56d456d68c7d62a0b2da2289bad36e4ed8ea"}
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.845290 4755 scope.go:117] "RemoveContainer" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852397 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="b0a79f5349b8f36bb4e504654c5686b5e5f4facc31a9a186a8ef5a0b760f88d5" exitCode=0
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852460 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"b0a79f5349b8f36bb4e504654c5686b5e5f4facc31a9a186a8ef5a0b760f88d5"}
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.852503 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerStarted","Data":"3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40"}
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.866944 4755 scope.go:117] "RemoveContainer" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"
Mar 20 13:44:38 crc kubenswrapper[4755]: E0320 13:44:38.872492 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": container with ID starting with 664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea not found: ID does not exist" containerID="664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.872595 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea"} err="failed to get container status \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": rpc error: code = NotFound desc = could not find container \"664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea\": container with ID starting with 664772756b7965ec3191f2f8c79b109f78861d78da2709998697c2ac7ebb68ea not found: ID does not exist"
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.908874 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"]
Mar 20 13:44:38 crc kubenswrapper[4755]: I0320 13:44:38.916517 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-rb5zn"]
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.238644 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27405a42-41b4-4521-93f3-41d029fab255" path="/var/lib/kubelet/pods/27405a42-41b4-4521-93f3-41d029fab255/volumes"
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.930985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:44:39 crc kubenswrapper[4755]: E0320 13:44:39.931689 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console"
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.931709 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console"
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.931946 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="27405a42-41b4-4521-93f3-41d029fab255" containerName="console"
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.933275 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:39 crc kubenswrapper[4755]: I0320 13:44:39.961680 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.045258 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.146985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.147305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.147522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.173028 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"redhat-operators-v8mxq\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.271811 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.528101 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"]
Mar 20 13:44:40 crc kubenswrapper[4755]: W0320 13:44:40.539600 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae13042d_ead5_4853_8f3e_cc16f6b3515f.slice/crio-b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d WatchSource:0}: Error finding container b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d: Status 404 returned error can't find the container with id b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872292 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547" exitCode=0
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872352 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"}
Mar 20 13:44:40 crc kubenswrapper[4755]: I0320 13:44:40.872426 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d"}
Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.887228 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="10e74951c5cfdeb46b551d50b4326ea04ba813a6a09c3fcd46546f102ad9b3a6" exitCode=0
Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.887297 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"10e74951c5cfdeb46b551d50b4326ea04ba813a6a09c3fcd46546f102ad9b3a6"}
Mar 20 13:44:42 crc kubenswrapper[4755]: I0320 13:44:42.902113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"}
Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.914691 4755 generic.go:334] "Generic (PLEG): container finished" podID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerID="ff1a63dbaca1263c3bef126afa3f8efe75b50982f67c8485e65c31fdd4d68c3d" exitCode=0
Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.914781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"ff1a63dbaca1263c3bef126afa3f8efe75b50982f67c8485e65c31fdd4d68c3d"}
Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.919842 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9" exitCode=0
Mar 20 13:44:43 crc kubenswrapper[4755]: I0320 13:44:43.919932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"}
Mar 20 13:44:44 crc kubenswrapper[4755]: I0320 13:44:44.930689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerStarted","Data":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"}
Mar 20 13:44:44 crc kubenswrapper[4755]: I0320 13:44:44.957784 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v8mxq" podStartSLOduration=2.42720216 podStartE2EDuration="5.957749685s" podCreationTimestamp="2026-03-20 13:44:39 +0000 UTC" firstStartedPulling="2026-03-20 13:44:40.874345291 +0000 UTC m=+860.472277830" lastFinishedPulling="2026-03-20 13:44:44.404892806 +0000 UTC m=+864.002825355" observedRunningTime="2026-03-20 13:44:44.949912053 +0000 UTC m=+864.547844622" watchObservedRunningTime="2026-03-20 13:44:44.957749685 +0000 UTC m=+864.555682274"
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.257633 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336601 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") "
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") "
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.336744 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") pod \"0e8346da-4c59-4f8f-9804-02ad176bc15d\" (UID: \"0e8346da-4c59-4f8f-9804-02ad176bc15d\") "
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.339965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle" (OuterVolumeSpecName: "bundle") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.341082 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.345288 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw" (OuterVolumeSpecName: "kube-api-access-7jhnw") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "kube-api-access-7jhnw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.352885 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util" (OuterVolumeSpecName: "util") pod "0e8346da-4c59-4f8f-9804-02ad176bc15d" (UID: "0e8346da-4c59-4f8f-9804-02ad176bc15d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.443646 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0e8346da-4c59-4f8f-9804-02ad176bc15d-util\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.443756 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jhnw\" (UniqueName: \"kubernetes.io/projected/0e8346da-4c59-4f8f-9804-02ad176bc15d-kube-api-access-7jhnw\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.939742 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl" event={"ID":"0e8346da-4c59-4f8f-9804-02ad176bc15d","Type":"ContainerDied","Data":"3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40"}
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.940199 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f203cb156fd20d2d451bb767fab022c6fbf0e2bd115ec1ffa0e27d3f9e05e40"
Mar 20 13:44:45 crc kubenswrapper[4755]: I0320 13:44:45.939810 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl"
Mar 20 13:44:50 crc kubenswrapper[4755]: I0320 13:44:50.272747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:50 crc kubenswrapper[4755]: I0320 13:44:50.273191 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v8mxq"
Mar 20 13:44:51 crc kubenswrapper[4755]: I0320 13:44:51.367242 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v8mxq" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:44:51 crc kubenswrapper[4755]: 	timeout: failed to connect service ":50051" within 1s
Mar 20 13:44:51 crc kubenswrapper[4755]: >
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.694619 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"]
Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695394 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="util"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695410 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="util"
Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695425 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="pull"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695432 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="pull"
Mar 20 13:44:55 crc kubenswrapper[4755]: E0320 13:44:55.695449 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695457 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.695585 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e8346da-4c59-4f8f-9804-02ad176bc15d" containerName="extract"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.696100 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699337 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.699467 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.706735 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.706828 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9wmfx"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.720632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"]
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793347 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.793388 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894280 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"
Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894343 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: 
\"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.894432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.902169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-webhook-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.902188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/32289872-a679-4d10-8b2f-0519c713dc35-apiservice-cert\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:55 crc kubenswrapper[4755]: I0320 13:44:55.914135 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2dnq\" (UniqueName: \"kubernetes.io/projected/32289872-a679-4d10-8b2f-0519c713dc35-kube-api-access-z2dnq\") pod \"metallb-operator-controller-manager-6ddbc48b88-k4d8p\" (UID: \"32289872-a679-4d10-8b2f-0519c713dc35\") " pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.014448 4755 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"] Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.015249 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021304 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021149 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.021426 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bgjr9" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.036388 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"] Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097838 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod 
\"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.097922 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199043 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199487 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.199514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 
13:44:56.204508 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-webhook-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.204566 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f0274fca-6425-402c-a2aa-853b232ad93c-apiservice-cert\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.215473 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666w6\" (UniqueName: \"kubernetes.io/projected/f0274fca-6425-402c-a2aa-853b232ad93c-kube-api-access-666w6\") pod \"metallb-operator-webhook-server-588c694cdc-8vjlb\" (UID: \"f0274fca-6425-402c-a2aa-853b232ad93c\") " pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.272588 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p"] Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.336459 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:44:56 crc kubenswrapper[4755]: I0320 13:44:56.571461 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb"] Mar 20 13:44:56 crc kubenswrapper[4755]: W0320 13:44:56.581935 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0274fca_6425_402c_a2aa_853b232ad93c.slice/crio-0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02 WatchSource:0}: Error finding container 0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02: Status 404 returned error can't find the container with id 0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02 Mar 20 13:44:57 crc kubenswrapper[4755]: I0320 13:44:57.016393 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" event={"ID":"f0274fca-6425-402c-a2aa-853b232ad93c","Type":"ContainerStarted","Data":"0c7463227b608b6e53cae7fe0e3a3995ede91d60bd151841b7d8c0d9fdc0bd02"} Mar 20 13:44:57 crc kubenswrapper[4755]: I0320 13:44:57.017974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" event={"ID":"32289872-a679-4d10-8b2f-0519c713dc35","Type":"ContainerStarted","Data":"9bc843492cee7820681aa9d684db343e507a5861f4fc13d90fbab9ba6d4f0e68"} Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.047139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" event={"ID":"32289872-a679-4d10-8b2f-0519c713dc35","Type":"ContainerStarted","Data":"69561a3396b227b22bee31a3e412678dee6bd7981c81da53433ca78e25ba850d"} Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.047722 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.071850 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" podStartSLOduration=1.60250275 podStartE2EDuration="5.071823171s" podCreationTimestamp="2026-03-20 13:44:55 +0000 UTC" firstStartedPulling="2026-03-20 13:44:56.279768656 +0000 UTC m=+875.877701185" lastFinishedPulling="2026-03-20 13:44:59.749089067 +0000 UTC m=+879.347021606" observedRunningTime="2026-03-20 13:45:00.066002623 +0000 UTC m=+879.663935142" watchObservedRunningTime="2026-03-20 13:45:00.071823171 +0000 UTC m=+879.669755700" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.137392 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"] Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.138797 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.141321 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.141668 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.150143 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"] Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.265411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.366846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.367446 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.388769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.393082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.426940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"collect-profiles-29566905-bsz49\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.459092 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.480753 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:45:00 crc kubenswrapper[4755]: I0320 13:45:00.703471 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49"] Mar 20 13:45:00 crc kubenswrapper[4755]: W0320 13:45:00.712836 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d412e5_5c09_4c6d_ba8d_db4546796c70.slice/crio-88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a WatchSource:0}: Error finding container 88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a: Status 404 returned error can't find the container with id 88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a Mar 20 
13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.055042 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerStarted","Data":"a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f"} Mar 20 13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.055119 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerStarted","Data":"88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a"} Mar 20 13:45:01 crc kubenswrapper[4755]: I0320 13:45:01.274003 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" podStartSLOduration=1.27396798 podStartE2EDuration="1.27396798s" podCreationTimestamp="2026-03-20 13:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:01.076882322 +0000 UTC m=+880.674814851" watchObservedRunningTime="2026-03-20 13:45:01.27396798 +0000 UTC m=+880.871900549" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.063641 4755 generic.go:334] "Generic (PLEG): container finished" podID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerID="a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f" exitCode=0 Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.063705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerDied","Data":"a1448ef33f49866d32c38e4909005daac13db09c22679974202f30e41c06028f"} Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.114930 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.115207 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v8mxq" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" containerID="cri-o://88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" gracePeriod=2 Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.532002 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.603939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.604074 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.604101 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") pod \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\" (UID: \"ae13042d-ead5-4853-8f3e-cc16f6b3515f\") " Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.605151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities" (OuterVolumeSpecName: "utilities") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: 
"ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.612884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr" (OuterVolumeSpecName: "kube-api-access-hq9hr") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: "ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "kube-api-access-hq9hr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.705884 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.705936 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq9hr\" (UniqueName: \"kubernetes.io/projected/ae13042d-ead5-4853-8f3e-cc16f6b3515f-kube-api-access-hq9hr\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.733581 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae13042d-ead5-4853-8f3e-cc16f6b3515f" (UID: "ae13042d-ead5-4853-8f3e-cc16f6b3515f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4755]: I0320 13:45:02.807744 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae13042d-ead5-4853-8f3e-cc16f6b3515f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.075902 4755 generic.go:334] "Generic (PLEG): container finished" podID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" exitCode=0 Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076095 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"} Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v8mxq" event={"ID":"ae13042d-ead5-4853-8f3e-cc16f6b3515f","Type":"ContainerDied","Data":"b4ecc9d42703450c1286f5ca2f418668f97a75c0daea8b3f0734e7f52ebe0d0d"} Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076731 4755 scope.go:117] "RemoveContainer" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.076252 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v8mxq" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.116908 4755 scope.go:117] "RemoveContainer" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.134129 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.140444 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v8mxq"] Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.146923 4755 scope.go:117] "RemoveContainer" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166032 4755 scope.go:117] "RemoveContainer" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.166528 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": container with ID starting with 88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd not found: ID does not exist" containerID="88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166569 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd"} err="failed to get container status \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": rpc error: code = NotFound desc = could not find container \"88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd\": container with ID starting with 88d6b7f083b2168e20b75627ed656e189045f5da777d8ad987a99d74a3f6f9fd not found: ID does 
not exist" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.166600 4755 scope.go:117] "RemoveContainer" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9" Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.167120 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": container with ID starting with d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9 not found: ID does not exist" containerID="d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167183 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9"} err="failed to get container status \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": rpc error: code = NotFound desc = could not find container \"d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9\": container with ID starting with d4b74e5e46bb50147f4c4180a42a4ad04fea6e5c68632eb4ad96fcfedaa62bd9 not found: ID does not exist" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167227 4755 scope.go:117] "RemoveContainer" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547" Mar 20 13:45:03 crc kubenswrapper[4755]: E0320 13:45:03.167640 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": container with ID starting with 15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547 not found: ID does not exist" containerID="15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.167745 4755 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547"} err="failed to get container status \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": rpc error: code = NotFound desc = could not find container \"15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547\": container with ID starting with 15b9b110062283e3340da3100920ad0cf071d5e543461cd32c988fb6250d4547 not found: ID does not exist" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.237774 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" path="/var/lib/kubelet/pods/ae13042d-ead5-4853-8f3e-cc16f6b3515f/volumes" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.405047 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531041 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531119 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: \"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531158 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") pod \"94d412e5-5c09-4c6d-ba8d-db4546796c70\" (UID: 
\"94d412e5-5c09-4c6d-ba8d-db4546796c70\") " Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.531965 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume" (OuterVolumeSpecName: "config-volume") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.532267 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/94d412e5-5c09-4c6d-ba8d-db4546796c70-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.537171 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh" (OuterVolumeSpecName: "kube-api-access-fjqzh") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "kube-api-access-fjqzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.537224 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "94d412e5-5c09-4c6d-ba8d-db4546796c70" (UID: "94d412e5-5c09-4c6d-ba8d-db4546796c70"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.634427 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjqzh\" (UniqueName: \"kubernetes.io/projected/94d412e5-5c09-4c6d-ba8d-db4546796c70-kube-api-access-fjqzh\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:03 crc kubenswrapper[4755]: I0320 13:45:03.634491 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/94d412e5-5c09-4c6d-ba8d-db4546796c70-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087461 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" event={"ID":"94d412e5-5c09-4c6d-ba8d-db4546796c70","Type":"ContainerDied","Data":"88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a"} Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087516 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88905a795ecf93c3340a5c364b06d253d97642936b17be498a091ecc372c820a" Mar 20 13:45:04 crc kubenswrapper[4755]: I0320 13:45:04.087585 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-bsz49" Mar 20 13:45:06 crc kubenswrapper[4755]: I0320 13:45:06.751921 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:45:06 crc kubenswrapper[4755]: I0320 13:45:06.752560 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.109330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" event={"ID":"f0274fca-6425-402c-a2aa-853b232ad93c","Type":"ContainerStarted","Data":"b073f93f45a830615501a5b90f151f4ca77a77502554cec9ca5c5238bcb64a95"} Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.109676 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:45:07 crc kubenswrapper[4755]: I0320 13:45:07.143119 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" podStartSLOduration=2.271597668 podStartE2EDuration="12.143092748s" podCreationTimestamp="2026-03-20 13:44:55 +0000 UTC" firstStartedPulling="2026-03-20 13:44:56.585838876 +0000 UTC m=+876.183771405" lastFinishedPulling="2026-03-20 13:45:06.457333956 +0000 UTC m=+886.055266485" observedRunningTime="2026-03-20 13:45:07.135324527 +0000 UTC m=+886.733257076" watchObservedRunningTime="2026-03-20 
13:45:07.143092748 +0000 UTC m=+886.741025277" Mar 20 13:45:16 crc kubenswrapper[4755]: I0320 13:45:16.342829 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-588c694cdc-8vjlb" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.025988 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6ddbc48b88-k4d8p" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.751359 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.751434 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836514 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-5l5hs"] Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836889 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-content" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836912 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-content" Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836936 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles" Mar 20 13:45:36 crc 
kubenswrapper[4755]: I0320 13:45:36.836948 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles" Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.836974 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.836986 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" Mar 20 13:45:36 crc kubenswrapper[4755]: E0320 13:45:36.837008 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-utilities" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="extract-utilities" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837195 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="94d412e5-5c09-4c6d-ba8d-db4546796c70" containerName="collect-profiles" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.837222 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae13042d-ead5-4853-8f3e-cc16f6b3515f" containerName="registry-server" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.845680 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"] Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.845871 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.847404 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850033 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850742 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-qt2g7" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850087 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.850676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.858668 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"] Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943628 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsqp5\" (UniqueName: 
\"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943687 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943710 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943779 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943804 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.943833 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.973841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-6vf4n"] Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.974755 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6vf4n" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983066 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983564 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983730 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.983894 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-fb2f6" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.993360 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"] Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.994326 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:36 crc kubenswrapper[4755]: I0320 13:45:36.999406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045585 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045617 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045668 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045698 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: 
\"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045768 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsqp5\" (UniqueName: \"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: 
\"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045881 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045904 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045948 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045969 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " 
pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.045991 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.046282 4755 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.046372 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs podName:1152c78e-15f9-4826-acc3-3d7f5765db68 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:37.546353085 +0000 UTC m=+917.144285614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs") pod "frr-k8s-5l5hs" (UID: "1152c78e-15f9-4826-acc3-3d7f5765db68") : secret "frr-k8s-certs-secret" not found Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.046480 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-sockets\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047264 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-startup\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.047306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-frr-conf\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.056520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.067006 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/1152c78e-15f9-4826-acc3-3d7f5765db68-reloader\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.076071 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxxfk\" (UniqueName: \"kubernetes.io/projected/1152c78e-15f9-4826-acc3-3d7f5765db68-kube-api-access-gxxfk\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.085423 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsqp5\" (UniqueName: \"kubernetes.io/projected/490ee5e7-c0b1-4181-b7ac-86e5e61253a0-kube-api-access-fsqp5\") pod 
\"frr-k8s-webhook-server-bcc4b6f68-7xgrp\" (UID: \"490ee5e7-c0b1-4181-b7ac-86e5e61253a0\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.089475 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"] Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147751 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147769 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.147803 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.147951 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.148007 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist podName:839a8db3-662c-41c4-bb63-6b1027901ab5 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:37.647988061 +0000 UTC m=+917.245920590 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist") pod "speaker-6vf4n" (UID: "839a8db3-662c-41c4-bb63-6b1027901ab5") : secret "metallb-memberlist" not found Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.154541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/839a8db3-662c-41c4-bb63-6b1027901ab5-metallb-excludel2\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.157033 4755 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.160908 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-cert\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.161161 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-metrics-certs\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.167432 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a71f1548-62b5-4a77-9655-735bafa396c8-metrics-certs\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn" Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.176526 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.177594 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blmrj\" (UniqueName: \"kubernetes.io/projected/a71f1548-62b5-4a77-9655-735bafa396c8-kube-api-access-blmrj\") pod \"controller-7bb4cc7c98-qsbbn\" (UID: \"a71f1548-62b5-4a77-9655-735bafa396c8\") " pod="metallb-system/controller-7bb4cc7c98-qsbbn"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.188470 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp82l\" (UniqueName: \"kubernetes.io/projected/839a8db3-662c-41c4-bb63-6b1027901ab5-kube-api-access-xp82l\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.306502 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-qsbbn"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.557362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.565127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1152c78e-15f9-4826-acc3-3d7f5765db68-metrics-certs\") pod \"frr-k8s-5l5hs\" (UID: \"1152c78e-15f9-4826-acc3-3d7f5765db68\") " pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.625372 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"]
Mar 20 13:45:37 crc kubenswrapper[4755]: W0320 13:45:37.632962 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod490ee5e7_c0b1_4181_b7ac_86e5e61253a0.slice/crio-02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26 WatchSource:0}: Error finding container 02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26: Status 404 returned error can't find the container with id 02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.636943 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.659537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.659767 4755 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 20 13:45:37 crc kubenswrapper[4755]: E0320 13:45:37.659851 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist podName:839a8db3-662c-41c4-bb63-6b1027901ab5 nodeName:}" failed. No retries permitted until 2026-03-20 13:45:38.659824945 +0000 UTC m=+918.257757484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist") pod "speaker-6vf4n" (UID: "839a8db3-662c-41c4-bb63-6b1027901ab5") : secret "metallb-memberlist" not found
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.765778 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:37 crc kubenswrapper[4755]: I0320 13:45:37.863517 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-qsbbn"]
Mar 20 13:45:37 crc kubenswrapper[4755]: W0320 13:45:37.869762 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda71f1548_62b5_4a77_9655_735bafa396c8.slice/crio-524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43 WatchSource:0}: Error finding container 524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43: Status 404 returned error can't find the container with id 524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"d081085834f2957c6aa6a031fbdc25b3dc6e8284e1d3b312a5af2d26282ccc95"}
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328923 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"c1926c59c858fce55632d79e0f6389caba56bce2095eca782f46e6256acea58c"}
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.328942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-qsbbn" event={"ID":"a71f1548-62b5-4a77-9655-735bafa396c8","Type":"ContainerStarted","Data":"524d41aebcd587b7014132b3aa6b371d5a0c74f501ec19d04ae1763d4cc21c43"}
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.329307 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-qsbbn"
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.331966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" event={"ID":"490ee5e7-c0b1-4181-b7ac-86e5e61253a0","Type":"ContainerStarted","Data":"02869819a85c04b42664a599188cdae93c7785884830ed36fdd5cb0bf2ccca26"}
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.333869 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"fe82e7581cc8721da1697a5bf21d324aa60639e7864baf45db43acdeee5d9db4"}
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.354305 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-qsbbn" podStartSLOduration=2.354274147 podStartE2EDuration="2.354274147s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:38.351922383 +0000 UTC m=+917.949854952" watchObservedRunningTime="2026-03-20 13:45:38.354274147 +0000 UTC m=+917.952206716"
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.679535 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.688090 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/839a8db3-662c-41c4-bb63-6b1027901ab5-memberlist\") pod \"speaker-6vf4n\" (UID: \"839a8db3-662c-41c4-bb63-6b1027901ab5\") " pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:38 crc kubenswrapper[4755]: I0320 13:45:38.788272 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:38 crc kubenswrapper[4755]: W0320 13:45:38.833856 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod839a8db3_662c_41c4_bb63_6b1027901ab5.slice/crio-070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14 WatchSource:0}: Error finding container 070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14: Status 404 returned error can't find the container with id 070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14
Mar 20 13:45:39 crc kubenswrapper[4755]: I0320 13:45:39.349791 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"46f6128ee887d20789e0b1476a1a3cab480f929c2b6aef166e62d3ae334a51cb"}
Mar 20 13:45:39 crc kubenswrapper[4755]: I0320 13:45:39.350299 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"070244a038697f091b83b271255f8d27312d4b3362fdbcf3cc2eecc8497c0f14"}
Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.359225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-6vf4n" event={"ID":"839a8db3-662c-41c4-bb63-6b1027901ab5","Type":"ContainerStarted","Data":"e59f930c84cd84e3d3023623c4734de877d54ca96977002979d975c208b57b67"}
Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.360402 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-6vf4n"
Mar 20 13:45:40 crc kubenswrapper[4755]: I0320 13:45:40.396220 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-6vf4n" podStartSLOduration=4.396195169 podStartE2EDuration="4.396195169s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:40.394075631 +0000 UTC m=+919.992008160" watchObservedRunningTime="2026-03-20 13:45:40.396195169 +0000 UTC m=+919.994127698"
Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.446016 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="ed53424a28d633c3e292590ed8accef6bafe2195383ff37ed03e25723c93fb40" exitCode=0
Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.446082 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"ed53424a28d633c3e292590ed8accef6bafe2195383ff37ed03e25723c93fb40"}
Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.450073 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" event={"ID":"490ee5e7-c0b1-4181-b7ac-86e5e61253a0","Type":"ContainerStarted","Data":"2b85f34fd7d6132829066984b3845c42624a4b339ab8d945fbf7b585b2795b83"}
Mar 20 13:45:46 crc kubenswrapper[4755]: I0320 13:45:46.450515 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.459165 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="ac7ec846a4acdfb04966b3d9ee1849da3c74a5dec549e78b7ee3fcd6a0a66380" exitCode=0
Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.459283 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"ac7ec846a4acdfb04966b3d9ee1849da3c74a5dec549e78b7ee3fcd6a0a66380"}
Mar 20 13:45:47 crc kubenswrapper[4755]: I0320 13:45:47.513002 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp" podStartSLOduration=3.489145042 podStartE2EDuration="11.512972482s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="2026-03-20 13:45:37.63649189 +0000 UTC m=+917.234424419" lastFinishedPulling="2026-03-20 13:45:45.66031933 +0000 UTC m=+925.258251859" observedRunningTime="2026-03-20 13:45:46.494900435 +0000 UTC m=+926.092832964" watchObservedRunningTime="2026-03-20 13:45:47.512972482 +0000 UTC m=+927.110905021"
Mar 20 13:45:48 crc kubenswrapper[4755]: I0320 13:45:48.469060 4755 generic.go:334] "Generic (PLEG): container finished" podID="1152c78e-15f9-4826-acc3-3d7f5765db68" containerID="d4c39ae49ed20a2410ce4c58a48e2d2553860145121ddff9391e829737679c5f" exitCode=0
Mar 20 13:45:48 crc kubenswrapper[4755]: I0320 13:45:48.469143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerDied","Data":"d4c39ae49ed20a2410ce4c58a48e2d2553860145121ddff9391e829737679c5f"}
Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"53be76c677173dce74a8ced31f2dae6b2875ad21b7a4360bd7f85ea5468f7d47"}
Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484552 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"307fb133cb09d3633efa9cc9e62abc7b36a901314d32fc82f18e41b671601b89"}
Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"647da65ab737d68b0933db7cc763e0d8d81a12ead2e60418703f0b53fcd7cc6f"}
Mar 20 13:45:49 crc kubenswrapper[4755]: I0320 13:45:49.484575 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"56e527f6cffc79a1659a7c5d29290a27f868a61605184138e68459ed6cacf974"}
Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.496247 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"82cfc7bd014a8eaea9b9a148bb9a767450366267f4c09ec4c1743b60797dedee"}
Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.497062 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.497162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-5l5hs" event={"ID":"1152c78e-15f9-4826-acc3-3d7f5765db68","Type":"ContainerStarted","Data":"783fffea87344140407fc6db504afe50d283defe3c73a9a986ef548e1338d514"}
Mar 20 13:45:50 crc kubenswrapper[4755]: I0320 13:45:50.525422 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-5l5hs" podStartSLOduration=6.8687956660000005 podStartE2EDuration="14.525389377s" podCreationTimestamp="2026-03-20 13:45:36 +0000 UTC" firstStartedPulling="2026-03-20 13:45:37.960267008 +0000 UTC m=+917.558199567" lastFinishedPulling="2026-03-20 13:45:45.616860749 +0000 UTC m=+925.214793278" observedRunningTime="2026-03-20 13:45:50.521066179 +0000 UTC m=+930.118998748" watchObservedRunningTime="2026-03-20 13:45:50.525389377 +0000 UTC m=+930.123321946"
Mar 20 13:45:52 crc kubenswrapper[4755]: I0320 13:45:52.766195 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:52 crc kubenswrapper[4755]: I0320 13:45:52.806356 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:45:57 crc kubenswrapper[4755]: I0320 13:45:57.182999 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-7xgrp"
Mar 20 13:45:57 crc kubenswrapper[4755]: I0320 13:45:57.312762 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-qsbbn"
Mar 20 13:45:58 crc kubenswrapper[4755]: I0320 13:45:58.792082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-6vf4n"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.143322 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"]
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.145349 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.149239 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.153542 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"]
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.154373 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.155029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.217814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.319122 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.345020 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"auto-csr-approver-29566906-gwzkg\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") " pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:00 crc kubenswrapper[4755]: I0320 13:46:00.477363 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.021389 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"]
Mar 20 13:46:01 crc kubenswrapper[4755]: W0320 13:46:01.032902 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47f9ab28_1218_4dcf_a989_728b9063a3e9.slice/crio-2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600 WatchSource:0}: Error finding container 2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600: Status 404 returned error can't find the container with id 2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.594519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerStarted","Data":"2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600"}
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.685841 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.687770 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.694423 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.694756 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-cq82b"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.696330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.700453 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.750775 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.854826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:01 crc kubenswrapper[4755]: I0320 13:46:01.874800 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"openstack-operator-index-zspb5\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") " pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.021061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.507078 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:02 crc kubenswrapper[4755]: W0320 13:46:02.530311 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56977506_369a_4f65_be85_3ff2319ac213.slice/crio-cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a WatchSource:0}: Error finding container cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a: Status 404 returned error can't find the container with id cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a
Mar 20 13:46:02 crc kubenswrapper[4755]: I0320 13:46:02.605424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerStarted","Data":"cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a"}
Mar 20 13:46:03 crc kubenswrapper[4755]: I0320 13:46:03.614833 4755 generic.go:334] "Generic (PLEG): container finished" podID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerID="cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f" exitCode=0
Mar 20 13:46:03 crc kubenswrapper[4755]: I0320 13:46:03.614948 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerDied","Data":"cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f"}
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.118294 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.316072 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") pod \"47f9ab28-1218-4dcf-a989-728b9063a3e9\" (UID: \"47f9ab28-1218-4dcf-a989-728b9063a3e9\") "
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.323002 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f" (OuterVolumeSpecName: "kube-api-access-nvh4f") pod "47f9ab28-1218-4dcf-a989-728b9063a3e9" (UID: "47f9ab28-1218-4dcf-a989-728b9063a3e9"). InnerVolumeSpecName "kube-api-access-nvh4f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.417606 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvh4f\" (UniqueName: \"kubernetes.io/projected/47f9ab28-1218-4dcf-a989-728b9063a3e9-kube-api-access-nvh4f\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634193 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-gwzkg" event={"ID":"47f9ab28-1218-4dcf-a989-728b9063a3e9","Type":"ContainerDied","Data":"2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600"}
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634300 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2beb746c0ebf8762e3e140d0880a971e415d5f5c3c88c29b6e6f1b1257831600"
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.634311 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-gwzkg"
Mar 20 13:46:05 crc kubenswrapper[4755]: I0320 13:46:05.641334 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.196925 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"]
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.204032 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-nvf9d"]
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.260879 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-98hpf"]
Mar 20 13:46:06 crc kubenswrapper[4755]: E0320 13:46:06.261243 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261261 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261405 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" containerName="oc"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.261949 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.270967 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-98hpf"]
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.332263 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.434846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.469695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9q8h\" (UniqueName: \"kubernetes.io/projected/b58ff15e-f098-460d-ada4-3bdd990125ba-kube-api-access-m9q8h\") pod \"openstack-operator-index-98hpf\" (UID: \"b58ff15e-f098-460d-ada4-3bdd990125ba\") " pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.583760 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-98hpf"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.751614 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.752229 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.752292 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.753329 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:46:06 crc kubenswrapper[4755]: I0320 13:46:06.753414 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" gracePeriod=600
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.021535 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-98hpf"]
Mar 20 13:46:07 crc kubenswrapper[4755]: W0320 13:46:07.022689 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58ff15e_f098_460d_ada4_3bdd990125ba.slice/crio-e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb WatchSource:0}: Error finding container e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb: Status 404 returned error can't find the container with id e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.237165 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af9d427-765f-4d25-9603-e0b39103e2cc" path="/var/lib/kubelet/pods/6af9d427-765f-4d25-9603-e0b39103e2cc/volumes"
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.653739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98hpf" event={"ID":"b58ff15e-f098-460d-ada4-3bdd990125ba","Type":"ContainerStarted","Data":"0dac578f595955d60f5e85a5f64f072755d5f2116d535d213c2c9c141d43d11a"}
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.654251 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-98hpf" event={"ID":"b58ff15e-f098-460d-ada4-3bdd990125ba","Type":"ContainerStarted","Data":"e1bd1ae42bde5410f31208636a39a266effd363fc8ed0f31fc24cfd65c4aa5cb"}
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.656379 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerStarted","Data":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"}
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.656571 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-zspb5" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" containerID="cri-o://50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" gracePeriod=2
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660702 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" exitCode=0
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38"}
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660777 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"}
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.660801 4755 scope.go:117] "RemoveContainer" containerID="993f7c13d265997ecffe3a99604c58d76999b476d5321491029634f6fe701d08"
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.680055 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-98hpf" podStartSLOduration=1.62983492 podStartE2EDuration="1.680026105s" podCreationTimestamp="2026-03-20 13:46:06 +0000 UTC" firstStartedPulling="2026-03-20 13:46:07.026921608 +0000 UTC m=+946.624854137" lastFinishedPulling="2026-03-20 13:46:07.077112783 +0000 UTC m=+946.675045322" observedRunningTime="2026-03-20 13:46:07.678623318 +0000 UTC m=+947.276555857" watchObservedRunningTime="2026-03-20 13:46:07.680026105 +0000 UTC m=+947.277958664"
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.718091 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zspb5" podStartSLOduration=2.6368764000000002 podStartE2EDuration="6.7180589s" podCreationTimestamp="2026-03-20 13:46:01 +0000 UTC" firstStartedPulling="2026-03-20 13:46:02.536494534 +0000 UTC m=+942.134427073" lastFinishedPulling="2026-03-20 13:46:06.617677044 +0000 UTC m=+946.215609573" observedRunningTime="2026-03-20 13:46:07.714865564 +0000 UTC m=+947.312798123" watchObservedRunningTime="2026-03-20 13:46:07.7180589 +0000 UTC m=+947.315991459"
Mar 20 13:46:07 crc kubenswrapper[4755]: I0320 13:46:07.770260 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-5l5hs"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.166229 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.364680 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") pod \"56977506-369a-4f65-be85-3ff2319ac213\" (UID: \"56977506-369a-4f65-be85-3ff2319ac213\") "
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.385177 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8" (OuterVolumeSpecName: "kube-api-access-l6xp8") pod "56977506-369a-4f65-be85-3ff2319ac213" (UID: "56977506-369a-4f65-be85-3ff2319ac213"). InnerVolumeSpecName "kube-api-access-l6xp8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.467557 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6xp8\" (UniqueName: \"kubernetes.io/projected/56977506-369a-4f65-be85-3ff2319ac213-kube-api-access-l6xp8\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677560 4755 generic.go:334] "Generic (PLEG): container finished" podID="56977506-369a-4f65-be85-3ff2319ac213" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f" exitCode=0
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677742 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zspb5"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.677739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerDied","Data":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"}
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.678481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zspb5" event={"ID":"56977506-369a-4f65-be85-3ff2319ac213","Type":"ContainerDied","Data":"cf3fdebd4fa4620f227c64e9d38435d8fc587836313316bd33f23317e15c9c0a"}
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.678519 4755 scope.go:117] "RemoveContainer" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.723052 4755 scope.go:117] "RemoveContainer" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"
Mar 20 13:46:08 crc kubenswrapper[4755]: E0320 13:46:08.725525 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": container with ID starting with 50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f not found: ID does not exist" containerID="50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.725586 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f"} err="failed to get container status \"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": rpc error: code = NotFound desc = could not find container \"50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f\": container with ID starting with 50caf008726d10b340618f62fa0a64476b17487f311db120ba0527a5a74ec40f not found: ID does not exist"
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.734809 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:08 crc kubenswrapper[4755]: I0320 13:46:08.743733 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zspb5"]
Mar 20 13:46:09 crc kubenswrapper[4755]: I0320 13:46:09.239427 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56977506-369a-4f65-be85-3ff2319ac213" path="/var/lib/kubelet/pods/56977506-369a-4f65-be85-3ff2319ac213/volumes"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.664931 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"]
Mar 20 13:46:13 crc kubenswrapper[4755]: E0320 13:46:13.666832 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server"
Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.666902 4755 state_mem.go:107] "Deleted CPUSet assignment"
podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.667342 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56977506-369a-4f65-be85-3ff2319ac213" containerName="registry-server" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.670003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.682376 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755718 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755836 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.755880 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.857727 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.858965 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.859013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:13 crc kubenswrapper[4755]: I0320 13:46:13.883750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"certified-operators-c2c4n\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.038193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.496788 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.730682 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f" exitCode=0 Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.730872 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"} Mar 20 13:46:14 crc kubenswrapper[4755]: I0320 13:46:14.733668 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"4a8482d9f40332c626354ce360719a45a8861fce5e40d0043025c4c874f4eed4"} Mar 20 13:46:15 crc kubenswrapper[4755]: I0320 13:46:15.744803 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"} Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.584643 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.585020 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.636232 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.756181 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad" exitCode=0 Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.756274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"} Mar 20 13:46:16 crc kubenswrapper[4755]: I0320 13:46:16.809419 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-98hpf" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.253814 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.256357 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.259514 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312635 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.312723 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413548 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413604 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.413686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.414191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.414568 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.441203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"redhat-marketplace-gbxwj\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.626536 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.780436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerStarted","Data":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"} Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.809197 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c2c4n" podStartSLOduration=2.329643205 podStartE2EDuration="4.809179352s" podCreationTimestamp="2026-03-20 13:46:13 +0000 UTC" firstStartedPulling="2026-03-20 13:46:14.732983443 +0000 UTC m=+954.330915972" lastFinishedPulling="2026-03-20 13:46:17.21251958 +0000 UTC m=+956.810452119" observedRunningTime="2026-03-20 13:46:17.804090433 +0000 UTC m=+957.402022962" watchObservedRunningTime="2026-03-20 13:46:17.809179352 +0000 UTC m=+957.407111881" Mar 20 13:46:17 crc kubenswrapper[4755]: I0320 13:46:17.898120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.289544 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"] Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.291072 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.293803 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mrwws" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.303723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"] Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356187 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.356720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 
13:46:18.457992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458136 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458276 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.458678 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.484283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.669201 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789280 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e" exitCode=0 Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e"} Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.789780 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"2576fac110f0f8c9949cb9355f303a460a65145a8245ced3044dd2d52050d5d9"} Mar 20 13:46:18 crc kubenswrapper[4755]: I0320 13:46:18.914553 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v"] Mar 20 13:46:18 crc kubenswrapper[4755]: W0320 13:46:18.921169 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74e2672d_2bea_46ce_961b_58decbe4a9c4.slice/crio-48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b WatchSource:0}: Error finding container 48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b: Status 404 returned error can't find the container with id 48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.798536 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="036824045df862fed19c828c5797abf74f91ee509bc715ee570a02cfd88be198" exitCode=0 Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.798607 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"036824045df862fed19c828c5797abf74f91ee509bc715ee570a02cfd88be198"} Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.800286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerStarted","Data":"48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b"} Mar 20 13:46:19 crc kubenswrapper[4755]: I0320 13:46:19.803315 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e"} Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 
13:46:20.811756 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e" exitCode=0 Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.811864 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e"} Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.814751 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="17772537607eeddbda658477b19b56ca7a81d21afc8bf3352bc4081f64d0344e" exitCode=0 Mar 20 13:46:20 crc kubenswrapper[4755]: I0320 13:46:20.814804 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"17772537607eeddbda658477b19b56ca7a81d21afc8bf3352bc4081f64d0344e"} Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.828235 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerID="9b8c9133d316b84d95857a5bbf000e1aec6ee4c309b68cb9c7bfd99bf8be3084" exitCode=0 Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.828413 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"9b8c9133d316b84d95857a5bbf000e1aec6ee4c309b68cb9c7bfd99bf8be3084"} Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.834109 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" 
event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerStarted","Data":"fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4"} Mar 20 13:46:21 crc kubenswrapper[4755]: I0320 13:46:21.866075 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gbxwj" podStartSLOduration=2.228820443 podStartE2EDuration="4.86605268s" podCreationTimestamp="2026-03-20 13:46:17 +0000 UTC" firstStartedPulling="2026-03-20 13:46:18.792209886 +0000 UTC m=+958.390142415" lastFinishedPulling="2026-03-20 13:46:21.429442083 +0000 UTC m=+961.027374652" observedRunningTime="2026-03-20 13:46:21.863771449 +0000 UTC m=+961.461703988" watchObservedRunningTime="2026-03-20 13:46:21.86605268 +0000 UTC m=+961.463985209" Mar 20 13:46:22 crc kubenswrapper[4755]: I0320 13:46:22.455286 4755 scope.go:117] "RemoveContainer" containerID="5e2db37c71b317712977cebcaa50946c5ceb89b2e6b5818b0d77ab95b610e0b6" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.186970 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329548 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329751 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.329779 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") pod \"74e2672d-2bea-46ce-961b-58decbe4a9c4\" (UID: \"74e2672d-2bea-46ce-961b-58decbe4a9c4\") " Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.330327 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle" (OuterVolumeSpecName: "bundle") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.336464 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s" (OuterVolumeSpecName: "kube-api-access-6rd8s") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "kube-api-access-6rd8s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.363240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util" (OuterVolumeSpecName: "util") pod "74e2672d-2bea-46ce-961b-58decbe4a9c4" (UID: "74e2672d-2bea-46ce-961b-58decbe4a9c4"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431246 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rd8s\" (UniqueName: \"kubernetes.io/projected/74e2672d-2bea-46ce-961b-58decbe4a9c4-kube-api-access-6rd8s\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431284 4755 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.431293 4755 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/74e2672d-2bea-46ce-961b-58decbe4a9c4-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" event={"ID":"74e2672d-2bea-46ce-961b-58decbe4a9c4","Type":"ContainerDied","Data":"48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b"} Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851250 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48cb57890e087a46f2a9b0c3f662831cbf6c8bb7b7b72855ad7974a22de00b3b" Mar 20 13:46:23 crc kubenswrapper[4755]: I0320 13:46:23.851264 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v" Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.040092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.040486 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.095546 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:24 crc kubenswrapper[4755]: I0320 13:46:24.937458 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.444138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.445099 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c2c4n" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server" containerID="cri-o://008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" gracePeriod=2 Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.627446 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.629404 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.707535 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 
13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.812759 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887115 4755 generic.go:334] "Generic (PLEG): container finished" podID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" exitCode=0 Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887249 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c2c4n" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"} Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c2c4n" event={"ID":"dec99880-d50d-4204-a8d5-0079e4175e5c","Type":"ContainerDied","Data":"4a8482d9f40332c626354ce360719a45a8861fce5e40d0043025c4c874f4eed4"} Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.887362 4755 scope.go:117] "RemoveContainer" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.905630 4755 scope.go:117] "RemoveContainer" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.936343 4755 scope.go:117] "RemoveContainer" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.943120 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.961717 4755 scope.go:117] "RemoveContainer" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.962476 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": container with ID starting with 008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f not found: ID does not exist" containerID="008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.962543 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f"} err="failed to get container status \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": rpc error: code = NotFound desc = could not find container \"008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f\": container with ID starting with 008aa910ad5f19e02f325485f78f5958b153a6e31542ae4627e963f94f325e1f not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.962587 4755 scope.go:117] "RemoveContainer" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad" Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.963313 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": container with ID starting with a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad not found: ID does not exist" containerID="a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.963360 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad"} err="failed to get container status \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": rpc error: code = NotFound desc = could not find container \"a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad\": container with ID starting with a02940dc83a55da04b593b63a1382dc9bc1e467d78d3ba09604899476b0580ad not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.963387 4755 scope.go:117] "RemoveContainer" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f" Mar 20 13:46:27 crc kubenswrapper[4755]: E0320 13:46:27.964235 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": container with ID starting with 7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f not found: ID does not exist" containerID="7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.964277 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f"} err="failed to get container status \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": rpc error: code = NotFound desc = could not find container \"7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f\": container with ID starting with 7e3ba586d801180d2a28b31c3f388a3190852d5958e983fd727b912235e5aa9f not found: ID does not exist" Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999162 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999285 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " Mar 20 13:46:27 crc kubenswrapper[4755]: I0320 13:46:27.999497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") pod \"dec99880-d50d-4204-a8d5-0079e4175e5c\" (UID: \"dec99880-d50d-4204-a8d5-0079e4175e5c\") " Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.003479 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities" (OuterVolumeSpecName: "utilities") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.007721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph" (OuterVolumeSpecName: "kube-api-access-xt9ph") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "kube-api-access-xt9ph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.055781 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dec99880-d50d-4204-a8d5-0079e4175e5c" (UID: "dec99880-d50d-4204-a8d5-0079e4175e5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100832 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt9ph\" (UniqueName: \"kubernetes.io/projected/dec99880-d50d-4204-a8d5-0079e4175e5c-kube-api-access-xt9ph\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100867 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.100877 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dec99880-d50d-4204-a8d5-0079e4175e5c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.225147 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:28 crc kubenswrapper[4755]: I0320 13:46:28.231753 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c2c4n"] Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.246306 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" path="/var/lib/kubelet/pods/dec99880-d50d-4204-a8d5-0079e4175e5c/volumes" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647133 4755 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.647752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="pull" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647846 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="pull" Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.647918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="util" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.647970 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="util" Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648024 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-utilities" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-utilities" Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648136 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648191 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server" Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648248 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="extract-content" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648297 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" 
containerName="extract-content" Mar 20 13:46:29 crc kubenswrapper[4755]: E0320 13:46:29.648350 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648401 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648561 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec99880-d50d-4204-a8d5-0079e4175e5c" containerName="registry-server" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.648624 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e2672d-2bea-46ce-961b-58decbe4a9c4" containerName="extract" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.649578 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.665913 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.721256 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"] Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.723102 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.726690 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-rvkg9" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.745038 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"] Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827608 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827739 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.827834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66k9h\" 
(UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929729 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929830 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.929891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.930691 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.931257 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.931330 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.950060 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"community-operators-lxv97\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") " pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.950113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9ttj\" (UniqueName: \"kubernetes.io/projected/e837c2d9-26ab-47a1-b48a-44f28fc2e2a6-kube-api-access-x9ttj\") pod \"openstack-operator-controller-init-6c645d7445-cbmxt\" (UID: \"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6\") " pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:29 crc kubenswrapper[4755]: I0320 13:46:29.965589 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97" Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.038474 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.283245 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lxv97"] Mar 20 13:46:30 crc kubenswrapper[4755]: W0320 13:46:30.290460 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd61b3b_557c_4bfc_9ca3_784180e34cf4.slice/crio-f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c WatchSource:0}: Error finding container f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c: Status 404 returned error can't find the container with id f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.597484 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"] Mar 20 13:46:30 crc kubenswrapper[4755]: W0320 13:46:30.609247 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode837c2d9_26ab_47a1_b48a_44f28fc2e2a6.slice/crio-9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540 WatchSource:0}: Error finding container 9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540: Status 404 returned error can't find the container with id 9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540 Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911827 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359" exitCode=0 Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911921 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" 
event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359"} Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.911978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerStarted","Data":"f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c"} Mar 20 13:46:30 crc kubenswrapper[4755]: I0320 13:46:30.913930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" event={"ID":"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6","Type":"ContainerStarted","Data":"9f7cdb96c5f147d38ca8c02eda5c766c505749e73a1f77e915ccb0c10dcc8540"} Mar 20 13:46:31 crc kubenswrapper[4755]: I0320 13:46:31.933027 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7" exitCode=0 Mar 20 13:46:31 crc kubenswrapper[4755]: I0320 13:46:31.934250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7"} Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.243196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.243438 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gbxwj" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server" containerID="cri-o://fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" gracePeriod=2 Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 
13:46:32.943285 4755 generic.go:334] "Generic (PLEG): container finished" podID="47277032-9d6e-4e0e-81a1-42a899786245" containerID="fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" exitCode=0 Mar 20 13:46:32 crc kubenswrapper[4755]: I0320 13:46:32.943350 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.351725 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439227 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439319 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.439385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") pod \"47277032-9d6e-4e0e-81a1-42a899786245\" (UID: \"47277032-9d6e-4e0e-81a1-42a899786245\") " Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.440604 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities" (OuterVolumeSpecName: "utilities") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.447873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs" (OuterVolumeSpecName: "kube-api-access-6d4xs") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "kube-api-access-6d4xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.480377 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47277032-9d6e-4e0e-81a1-42a899786245" (UID: "47277032-9d6e-4e0e-81a1-42a899786245"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541495 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6d4xs\" (UniqueName: \"kubernetes.io/projected/47277032-9d6e-4e0e-81a1-42a899786245-kube-api-access-6d4xs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541526 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.541539 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47277032-9d6e-4e0e-81a1-42a899786245-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.969018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerStarted","Data":"855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972288 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gbxwj" event={"ID":"47277032-9d6e-4e0e-81a1-42a899786245","Type":"ContainerDied","Data":"2576fac110f0f8c9949cb9355f303a460a65145a8245ced3044dd2d52050d5d9"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972348 4755 scope.go:117] "RemoveContainer" containerID="fa0580ec23b9e2558f6029d0dae41d266eaa8c95386cb96e0c17868d2f3866f4" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.972496 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gbxwj" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.976497 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" event={"ID":"e837c2d9-26ab-47a1-b48a-44f28fc2e2a6","Type":"ContainerStarted","Data":"c0d02e673f830c04ff018db3b765586e7afdf71e7fee50fab41ea873a737d1bf"} Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.976681 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" Mar 20 13:46:35 crc kubenswrapper[4755]: I0320 13:46:35.992132 4755 scope.go:117] "RemoveContainer" containerID="758a773ec6c0ec479a241ed92839ca65f8621b43db226d9218b7999ca9f5568e" Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.012349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lxv97" podStartSLOduration=2.391152284 podStartE2EDuration="7.012316955s" podCreationTimestamp="2026-03-20 13:46:29 +0000 UTC" firstStartedPulling="2026-03-20 13:46:30.913810428 +0000 UTC m=+970.511742947" lastFinishedPulling="2026-03-20 13:46:35.534975089 +0000 UTC m=+975.132907618" observedRunningTime="2026-03-20 13:46:35.996492444 +0000 UTC m=+975.594424973" watchObservedRunningTime="2026-03-20 13:46:36.012316955 +0000 UTC m=+975.610249484" Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.013938 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.019862 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gbxwj"] Mar 20 13:46:36 crc kubenswrapper[4755]: I0320 13:46:36.020978 4755 scope.go:117] "RemoveContainer" containerID="77b04896a1353bc4a2e73a9e9c84e70a96936d649c2d4e962bf525f42511fa6e" Mar 20 13:46:36 crc 
kubenswrapper[4755]: I0320 13:46:36.041317 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt" podStartSLOduration=2.048812521 podStartE2EDuration="7.041294763s" podCreationTimestamp="2026-03-20 13:46:29 +0000 UTC" firstStartedPulling="2026-03-20 13:46:30.612622345 +0000 UTC m=+970.210554874" lastFinishedPulling="2026-03-20 13:46:35.605104577 +0000 UTC m=+975.203037116" observedRunningTime="2026-03-20 13:46:36.040386439 +0000 UTC m=+975.638319008" watchObservedRunningTime="2026-03-20 13:46:36.041294763 +0000 UTC m=+975.639227292"
Mar 20 13:46:37 crc kubenswrapper[4755]: I0320 13:46:37.232982 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47277032-9d6e-4e0e-81a1-42a899786245" path="/var/lib/kubelet/pods/47277032-9d6e-4e0e-81a1-42a899786245/volumes"
Mar 20 13:46:39 crc kubenswrapper[4755]: I0320 13:46:39.967177 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:39 crc kubenswrapper[4755]: I0320 13:46:39.967867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.044424 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6c645d7445-cbmxt"
Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.047006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:40 crc kubenswrapper[4755]: I0320 13:46:40.141080 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:42 crc kubenswrapper[4755]: I0320 13:46:42.447399 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxv97"]
Mar 20 13:46:42 crc kubenswrapper[4755]: I0320 13:46:42.448182 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lxv97" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server" containerID="cri-o://855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b" gracePeriod=2
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.069092 4755 generic.go:334] "Generic (PLEG): container finished" podID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerID="855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b" exitCode=0
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.069137 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b"}
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.384709 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.564016 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") "
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.565710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") "
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.565955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") pod \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\" (UID: \"2cd61b3b-557c-4bfc-9ca3-784180e34cf4\") "
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.567191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities" (OuterVolumeSpecName: "utilities") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.574408 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h" (OuterVolumeSpecName: "kube-api-access-66k9h") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "kube-api-access-66k9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.623547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2cd61b3b-557c-4bfc-9ca3-784180e34cf4" (UID: "2cd61b3b-557c-4bfc-9ca3-784180e34cf4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668293 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66k9h\" (UniqueName: \"kubernetes.io/projected/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-kube-api-access-66k9h\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668354 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:43 crc kubenswrapper[4755]: I0320 13:46:43.668379 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2cd61b3b-557c-4bfc-9ca3-784180e34cf4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.089551 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lxv97" event={"ID":"2cd61b3b-557c-4bfc-9ca3-784180e34cf4","Type":"ContainerDied","Data":"f6e3d9c32ee169e04965cba1766d1a594a6bef7604eb9e3d6059f565709ec29c"}
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.089773 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lxv97"
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.090396 4755 scope.go:117] "RemoveContainer" containerID="855ced0f4278fbd305a47718a7a8fa79560f8e11086f9d14e06ea490815e2c7b"
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.109475 4755 scope.go:117] "RemoveContainer" containerID="cfeab6a9dcefec03928d4fe10a66cdd355e4a9240128bfce4f125c6b8e4019f7"
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.124181 4755 scope.go:117] "RemoveContainer" containerID="7af298ba286ba398de5b17fcdc33ee1e8e40174a379e05601893aa3b49135359"
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.187009 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lxv97"]
Mar 20 13:46:44 crc kubenswrapper[4755]: I0320 13:46:44.194380 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lxv97"]
Mar 20 13:46:45 crc kubenswrapper[4755]: I0320 13:46:45.232593 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" path="/var/lib/kubelet/pods/2cd61b3b-557c-4bfc-9ca3-784180e34cf4/volumes"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.361018 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"]
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.361970 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-content"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.361990 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-content"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362010 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362018 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362032 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362039 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362051 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-content"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-content"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362071 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-utilities"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362078 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="extract-utilities"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.362092 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-utilities"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362099 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="extract-utilities"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362239 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd61b3b-557c-4bfc-9ca3-784180e34cf4" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362261 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="47277032-9d6e-4e0e-81a1-42a899786245" containerName="registry-server"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.362805 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.364602 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-wbx7b"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.380595 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.381427 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.384354 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-rk298"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.396167 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.406674 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.407613 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.420346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-r9t7k"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.422840 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.427339 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.428348 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.436841 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-xb8qd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.441079 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.447354 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.448425 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.450271 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-wqxv7"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.457572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.465665 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.486495 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.487630 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.489703 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.490391 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.490973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ttdmd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.514678 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l7q5f"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.514880 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.516028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.516133 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.576150 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.589744 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.600335 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.601448 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.613073 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-f9ccv"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.613260 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617071 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: \"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617116 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: \"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617150 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617186 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617212 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617230 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.617288 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.646955 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.648362 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654179 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-56pln"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654550 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.654568 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp8x9\" (UniqueName: \"kubernetes.io/projected/3a22a8d8-92cd-4177-a597-9c659673392c-kube-api-access-pp8x9\") pod \"barbican-operator-controller-manager-59bc569d95-ds4tb\" (UID: \"3a22a8d8-92cd-4177-a597-9c659673392c\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.658872 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkwj7\" (UniqueName: \"kubernetes.io/projected/4c1ba89a-aed6-4245-8411-4d1fecac2500-kube-api-access-pkwj7\") pod \"cinder-operator-controller-manager-8d58dc466-pr9d5\" (UID: \"4c1ba89a-aed6-4245-8411-4d1fecac2500\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.667547 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.668473 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.676076 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nkxw2"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.684013 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.697381 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.698369 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.699519 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.701151 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hk896"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723539 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723612 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723640 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: \"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723736 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: \"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.723792 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg89d\" (UniqueName: \"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.724153 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 13:46:58 crc kubenswrapper[4755]: E0320 13:46:58.724199 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:46:59.224178837 +0000 UTC m=+998.822111366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.711568 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.730703 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.773603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7ntp\" (UniqueName: \"kubernetes.io/projected/552e0390-e86e-4972-bf6f-a4570e6b6f81-kube-api-access-h7ntp\") pod \"heat-operator-controller-manager-67dd5f86f5-92fwj\" (UID: \"552e0390-e86e-4972-bf6f-a4570e6b6f81\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.777828 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.779698 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.783131 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-lvnsc"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpswh\" (UniqueName: \"kubernetes.io/projected/bc80030a-428b-4643-9d8d-2b0e9c873060-kube-api-access-tpswh\") pod \"glance-operator-controller-manager-79df6bcc97-cbj27\" (UID: \"bc80030a-428b-4643-9d8d-2b0e9c873060\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786852 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmtx\" (UniqueName: \"kubernetes.io/projected/83d6120d-b54b-452c-aa8a-026665f1afae-kube-api-access-qbmtx\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.786963 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtf4x\" (UniqueName: \"kubernetes.io/projected/52210224-8989-4e16-8fdf-4ea3a8211b10-kube-api-access-qtf4x\") pod \"horizon-operator-controller-manager-8464cc45fb-hxmnd\" (UID: \"52210224-8989-4e16-8fdf-4ea3a8211b10\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.793753 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg98q\" (UniqueName: \"kubernetes.io/projected/00fc80a4-4ea8-4f61-8795-6473f0adc40a-kube-api-access-bg98q\") pod \"designate-operator-controller-manager-588d4d986b-xd8mk\" (UID: \"00fc80a4-4ea8-4f61-8795-6473f0adc40a\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.800742 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.801765 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.805122 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.805173 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nwt2m"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.820811 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.822310 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.824524 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-cgp2k"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825275 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdzd7\" (UniqueName: \"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825340 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" (UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825436 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.825605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg89d\" (UniqueName: \"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.833391 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.841980 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.861382 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.874616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg89d\" (UniqueName: \"kubernetes.io/projected/21c9358d-2c84-4c38-9c91-8ca3dad4dab7-kube-api-access-qg89d\") pod \"ironic-operator-controller-manager-6f787dddc9-f2nbs\" (UID: \"21c9358d-2c84-4c38-9c91-8ca3dad4dab7\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.887859 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.892975 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.897949 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.899041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.900569 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-qk8zk"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.902997 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.903347 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n4kd2"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.918798 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"]
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927082 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"
Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdzd7\" (UniqueName:
\"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927213 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" (UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.927357 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod 
\"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.928980 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.943602 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.944577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.956087 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.957601 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-69t7b" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.967057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbcwh\" (UniqueName: \"kubernetes.io/projected/fe4ddc70-f382-4b32-8879-122023b45438-kube-api-access-qbcwh\") pod \"mariadb-operator-controller-manager-67ccfc9778-2d9hb\" (UID: \"fe4ddc70-f382-4b32-8879-122023b45438\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.975975 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjn75\" (UniqueName: \"kubernetes.io/projected/7f51051e-6a90-4582-a411-28a106c37118-kube-api-access-cjn75\") pod \"manila-operator-controller-manager-55f864c847-gszjd\" 
(UID: \"7f51051e-6a90-4582-a411-28a106c37118\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.978365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdzd7\" (UniqueName: \"kubernetes.io/projected/5a83ca27-3334-4aac-9129-5635d3af0714-kube-api-access-kdzd7\") pod \"keystone-operator-controller-manager-768b96df4c-4x5nd\" (UID: \"5a83ca27-3334-4aac-9129-5635d3af0714\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.990681 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.992664 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:58 crc kubenswrapper[4755]: I0320 13:46:58.999424 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-v52dq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.012235 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.012948 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.023999 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028634 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028688 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod \"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 
13:46:59.028732 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.028764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.036460 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.046488 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.049137 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb8qk\" (UniqueName: \"kubernetes.io/projected/1aaef0d5-16fe-4c61-82d5-660f29168171-kube-api-access-mb8qk\") pod \"neutron-operator-controller-manager-767865f676-c8crg\" (UID: \"1aaef0d5-16fe-4c61-82d5-660f29168171\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.050219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjwwg\" (UniqueName: \"kubernetes.io/projected/b3c037b9-79d2-45ea-9b92-66e50eb20e6b-kube-api-access-tjwwg\") pod \"octavia-operator-controller-manager-5b9f45d989-c5nbs\" (UID: \"b3c037b9-79d2-45ea-9b92-66e50eb20e6b\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.064855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz24w\" (UniqueName: \"kubernetes.io/projected/358d4809-db3b-4468-8c8c-4ffbedc0ec89-kube-api-access-wz24w\") pod \"nova-operator-controller-manager-5d488d59fb-r8jn7\" (UID: \"358d4809-db3b-4468-8c8c-4ffbedc0ec89\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.072341 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.079734 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.109033 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.116091 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-gdw6q" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.125716 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134027 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134187 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134228 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.134267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.136041 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.137910 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.137972 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:46:59.637951215 +0000 UTC m=+999.235883744 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.145273 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.162728 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.172303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncn8t\" (UniqueName: \"kubernetes.io/projected/bad91c65-94da-4f8a-addb-21b037197217-kube-api-access-ncn8t\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.174415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8b2z\" (UniqueName: \"kubernetes.io/projected/72ea5d65-7221-4b25-9025-7a5c31bae331-kube-api-access-b8b2z\") pod \"ovn-operator-controller-manager-884679f54-2xgt8\" (UID: \"72ea5d65-7221-4b25-9025-7a5c31bae331\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.219428 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235851 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235910 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.235985 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.236012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.236455 4755 
secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.236502 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.236484795 +0000 UTC m=+999.834417324 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.257354 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fz62\" (UniqueName: \"kubernetes.io/projected/a1b32bae-fa65-45aa-a8db-b46a7351ee2c-kube-api-access-4fz62\") pod \"placement-operator-controller-manager-5784578c99-j7qf2\" (UID: \"a1b32bae-fa65-45aa-a8db-b46a7351ee2c\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.258406 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.267748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.268581 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.268695 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.272842 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgm46\" (UniqueName: \"kubernetes.io/projected/14b4b9ba-026c-4fd7-a57d-545e62b6981e-kube-api-access-hgm46\") pod \"swift-operator-controller-manager-c674c5965-bfh6x\" (UID: \"14b4b9ba-026c-4fd7-a57d-545e62b6981e\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.272851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-69lnm" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.295401 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.332204 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.337532 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.349128 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.357970 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.360301 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.368257 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5x78\" (UniqueName: \"kubernetes.io/projected/bc93761d-ecc1-4179-8287-40fd76ba5ad1-kube-api-access-h5x78\") pod \"telemetry-operator-controller-manager-d6b694c5-hpmzq\" (UID: \"bc93761d-ecc1-4179-8287-40fd76ba5ad1\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.369194 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lbhd4" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.373287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.404123 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.405164 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.405612 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.408305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.408448 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.409137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7dlnc" Mar 20 13:46:59 crc kubenswrapper[4755]: W0320 13:46:59.425447 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a22a8d8_92cd_4177_a597_9c659673392c.slice/crio-5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4 WatchSource:0}: Error finding container 5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4: Status 404 returned error can't find the container with id 5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4 Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.428029 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz6tz\" (UniqueName: 
\"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.440921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.451414 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.472306 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.516833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542792 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz6tz\" (UniqueName: \"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.542930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.544171 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 
13:46:59.595782 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz6tz\" (UniqueName: \"kubernetes.io/projected/26dfba7a-f5fa-45bc-a187-91ddce4da2d6-kube-api-access-wz6tz\") pod \"test-operator-controller-manager-5c5cb9c4d7-4khh5\" (UID: \"26dfba7a-f5fa-45bc-a187-91ddce4da2d6\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.602919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh9w7\" (UniqueName: \"kubernetes.io/projected/88862bd4-c890-447c-b4ee-b9cb1a4928e8-kube-api-access-zh9w7\") pod \"watcher-operator-controller-manager-6c4d75f7f9-z9px7\" (UID: \"88862bd4-c890-447c-b4ee-b9cb1a4928e8\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.613451 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.645852 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.645904 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc 
kubenswrapper[4755]: I0320 13:46:59.645953 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.646003 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646147 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646216 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.146186962 +0000 UTC m=+999.744119481 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646577 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646609 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.146598303 +0000 UTC m=+999.744530832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646678 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: E0320 13:46:59.646701 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:00.646695186 +0000 UTC m=+1000.244627715 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.733599 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrvl2\" (UniqueName: \"kubernetes.io/projected/42c9c167-c386-4d60-868c-8b0b63fccbcd-kube-api-access-hrvl2\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.795071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.848003 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk"] Mar 20 13:46:59 crc kubenswrapper[4755]: I0320 13:46:59.972729 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj"] Mar 20 13:46:59 crc kubenswrapper[4755]: W0320 13:46:59.983149 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552e0390_e86e_4972_bf6f_a4570e6b6f81.slice/crio-ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff WatchSource:0}: Error finding container ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff: Status 404 returned error can't find the container with id ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.085635 4755 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.102126 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.111213 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc80030a_428b_4643_9d8d_2b0e9c873060.slice/crio-a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f WatchSource:0}: Error finding container a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f: Status 404 returned error can't find the container with id a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.158498 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.158610 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.158757 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 
13:47:00.158817 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:01.158797278 +0000 UTC m=+1000.756729807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.159244 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.159331 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:01.159308191 +0000 UTC m=+1000.757240710 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.198778 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gszjd"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.208434 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f51051e_6a90_4582_a411_28a106c37118.slice/crio-b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750 WatchSource:0}: Error finding container b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750: Status 404 returned error can't find the container with id b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.231614 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.241565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" event={"ID":"5a83ca27-3334-4aac-9129-5635d3af0714","Type":"ContainerStarted","Data":"9912ebc9c17c1fc59076b1178f5fa6620b1b1314c2609f6fa2c634f98c212c6a"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.243116 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" event={"ID":"552e0390-e86e-4972-bf6f-a4570e6b6f81","Type":"ContainerStarted","Data":"ec96abd0b7a1a250e1537f60ee80406d792b98f346488a87ce3be0ecdd8643ff"} Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.243608 4755 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3c037b9_79d2_45ea_9b92_66e50eb20e6b.slice/crio-a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d WatchSource:0}: Error finding container a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d: Status 404 returned error can't find the container with id a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.261347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.261686 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.261749 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:02.261722227 +0000 UTC m=+1001.859654756 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.263114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" event={"ID":"00fc80a4-4ea8-4f61-8795-6473f0adc40a","Type":"ContainerStarted","Data":"c82bf96563ace2a79cfd0edc367ecb82d6c8b7bd05f818afcbed05237bcff328"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.266849 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" event={"ID":"7f51051e-6a90-4582-a411-28a106c37118","Type":"ContainerStarted","Data":"b5ee579ffe9b23032143558fb51343b4470ef58419dc41f55e62986ec60ad750"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.270072 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-c8crg"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.284424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" event={"ID":"21c9358d-2c84-4c38-9c91-8ca3dad4dab7","Type":"ContainerStarted","Data":"31f726768bac25884e79e07c0eebcdc8bdfdffb1c222c432046d6d7b0aa6cbe3"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.287522 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" event={"ID":"4c1ba89a-aed6-4245-8411-4d1fecac2500","Type":"ContainerStarted","Data":"fab89acb33673d7163cf70dd5e20b02a7c5cfe8640d4de7f772733bc5c58d3cb"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.290193 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" event={"ID":"52210224-8989-4e16-8fdf-4ea3a8211b10","Type":"ContainerStarted","Data":"f1e539c8860b9e1818c85b8fd992e545698bb5822d4096f90949ff0ae808c317"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.292311 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" event={"ID":"bc80030a-428b-4643-9d8d-2b0e9c873060","Type":"ContainerStarted","Data":"a0a2c281c33a2022af9a707c6350db19040770390fa2f0ca8eca3246aa9d7b8f"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.294472 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" event={"ID":"fe4ddc70-f382-4b32-8879-122023b45438","Type":"ContainerStarted","Data":"8c3c232ec66ab427da213fe84bb964a64e40f8c0b5af7bf549ab7f717fa8d37e"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.296044 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" event={"ID":"3a22a8d8-92cd-4177-a597-9c659673392c","Type":"ContainerStarted","Data":"5a18af11c4c6ef06963999c97c07e9f940205369ec54af0f5d052d46549137d4"} Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.347689 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.352414 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ea5d65_7221_4b25_9025_7a5c31bae331.slice/crio-5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5 WatchSource:0}: Error finding container 5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5: Status 404 returned error can't find the container with id 
5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5 Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.354461 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b32bae_fa65_45aa_a8db_b46a7351ee2c.slice/crio-f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326 WatchSource:0}: Error finding container f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326: Status 404 returned error can't find the container with id f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.354547 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.361236 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod358d4809_db3b_4468_8c8c_4ffbedc0ec89.slice/crio-7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e WatchSource:0}: Error finding container 7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e: Status 404 returned error can't find the container with id 7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.364814 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2"] Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.366718 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wz24w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-r8jn7_openstack-operators(358d4809-db3b-4468-8c8c-4ffbedc0ec89): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.367847 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.423997 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq"] Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.431985 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.434822 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc93761d_ecc1_4179_8287_40fd76ba5ad1.slice/crio-2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1 WatchSource:0}: Error finding container 2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1: Status 404 returned error can't find the container with id 2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1 Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.435311 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26dfba7a_f5fa_45bc_a187_91ddce4da2d6.slice/crio-8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f WatchSource:0}: Error finding container 8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f: Status 404 returned error can't find the container with id 8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.437672 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x"] Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.442073 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h5x78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-hpmzq_openstack-operators(bc93761d-ecc1-4179-8287-40fd76ba5ad1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.443439 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.444277 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b4b9ba_026c_4fd7_a57d_545e62b6981e.slice/crio-e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c WatchSource:0}: Error finding container e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c: Status 404 returned error can't find the container with id e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.446789 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgm46,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-bfh6x_openstack-operators(14b4b9ba-026c-4fd7-a57d-545e62b6981e): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.447981 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.604358 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7"] Mar 20 13:47:00 crc kubenswrapper[4755]: W0320 13:47:00.619116 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88862bd4_c890_447c_b4ee_b9cb1a4928e8.slice/crio-f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4 WatchSource:0}: Error finding container f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4: Status 404 returned error can't find the container with id f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4 Mar 20 13:47:00 crc kubenswrapper[4755]: I0320 13:47:00.667442 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.667676 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:00 crc kubenswrapper[4755]: E0320 13:47:00.668067 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:02.668032421 +0000 UTC m=+1002.265964950 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.176019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.176139 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176237 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176339 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:03.176314909 +0000 UTC m=+1002.774247438 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176343 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.176457 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:03.176430982 +0000 UTC m=+1002.774363601 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.351612 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" event={"ID":"bc93761d-ecc1-4179-8287-40fd76ba5ad1","Type":"ContainerStarted","Data":"2baf61fb5caa161caec07fc2add652a85218c70f848cfbc9ca94de6997ec71e1"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.354844 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:01 crc 
kubenswrapper[4755]: I0320 13:47:01.405080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" event={"ID":"14b4b9ba-026c-4fd7-a57d-545e62b6981e","Type":"ContainerStarted","Data":"e9751dea37aa6c86a2be69de16a482043076e9cae57be68af3de8845e59a5c4c"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.406449 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.410689 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" event={"ID":"b3c037b9-79d2-45ea-9b92-66e50eb20e6b","Type":"ContainerStarted","Data":"a58800c9634b9e9170aac5fb4c72c58374447629ed5375942fff246869ff823d"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.412740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" event={"ID":"a1b32bae-fa65-45aa-a8db-b46a7351ee2c","Type":"ContainerStarted","Data":"f68c1761b64f97ee42170a8c4a8acbe6606a29dfb4bd234c309677154ef27326"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.420840 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" event={"ID":"72ea5d65-7221-4b25-9025-7a5c31bae331","Type":"ContainerStarted","Data":"5b3691adace724ede20b6ac9feac8441a7a86423cd8459fb2ec29265d4636bb5"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.425984 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" event={"ID":"88862bd4-c890-447c-b4ee-b9cb1a4928e8","Type":"ContainerStarted","Data":"f67daacea111247b6f2cf7784a8e05f6ddedb94eda97fe9fa92e843b470a40b4"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.432332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" event={"ID":"358d4809-db3b-4468-8c8c-4ffbedc0ec89","Type":"ContainerStarted","Data":"7d2778b0ac37c07eedca53aa45ebffd470e1461556be577080c46317980aba9e"} Mar 20 13:47:01 crc kubenswrapper[4755]: E0320 13:47:01.434273 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.441972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" event={"ID":"1aaef0d5-16fe-4c61-82d5-660f29168171","Type":"ContainerStarted","Data":"18344850f9150281d5e9e317dc947a0e7719d553176de1cfac37f0292e5649cc"} Mar 20 13:47:01 crc kubenswrapper[4755]: I0320 13:47:01.443853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" event={"ID":"26dfba7a-f5fa-45bc-a187-91ddce4da2d6","Type":"ContainerStarted","Data":"8d5b910da1fcca825bd4f92db79f5d5ad861fa715226135c592518e8564d350f"} Mar 20 13:47:02 crc kubenswrapper[4755]: I0320 13:47:02.302132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod 
\"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.302317 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.302409 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:06.302385824 +0000 UTC m=+1005.900318353 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.525886 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podUID="358d4809-db3b-4468-8c8c-4ffbedc0ec89" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.526282 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podUID="14b4b9ba-026c-4fd7-a57d-545e62b6981e" Mar 20 
13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.532258 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podUID="bc93761d-ecc1-4179-8287-40fd76ba5ad1" Mar 20 13:47:02 crc kubenswrapper[4755]: I0320 13:47:02.708607 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.708824 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:02 crc kubenswrapper[4755]: E0320 13:47:02.708922 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:06.708896813 +0000 UTC m=+1006.306829332 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: I0320 13:47:03.217105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:03 crc kubenswrapper[4755]: I0320 13:47:03.217217 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217371 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217381 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217438 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:07.217417388 +0000 UTC m=+1006.815349917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:03 crc kubenswrapper[4755]: E0320 13:47:03.217494 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:07.217459619 +0000 UTC m=+1006.815392248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: I0320 13:47:06.393760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.394062 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.394260 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:14.394235544 +0000 UTC m=+1013.992168073 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: I0320 13:47:06.799922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.800201 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:06 crc kubenswrapper[4755]: E0320 13:47:06.800320 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:14.800292641 +0000 UTC m=+1014.398225180 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: I0320 13:47:07.308889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:07 crc kubenswrapper[4755]: I0320 13:47:07.309537 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.309798 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.309975 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:15.309909636 +0000 UTC m=+1014.907842245 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.310012 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:07 crc kubenswrapper[4755]: E0320 13:47:07.310137 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:15.310110061 +0000 UTC m=+1014.908042590 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: I0320 13:47:14.428554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.428965 4755 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.429482 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert 
podName:83d6120d-b54b-452c-aa8a-026665f1afae nodeName:}" failed. No retries permitted until 2026-03-20 13:47:30.429448225 +0000 UTC m=+1030.027380764 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert") pod "infra-operator-controller-manager-7b55fff5bb-sm4wg" (UID: "83d6120d-b54b-452c-aa8a-026665f1afae") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: I0320 13:47:14.836132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.836724 4755 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:14 crc kubenswrapper[4755]: E0320 13:47:14.836971 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert podName:bad91c65-94da-4f8a-addb-21b037197217 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:30.836941161 +0000 UTC m=+1030.434873730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert") pod "openstack-baremetal-operator-controller-manager-86657c54f58nx7v" (UID: "bad91c65-94da-4f8a-addb-21b037197217") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: I0320 13:47:15.346809 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:15 crc kubenswrapper[4755]: I0320 13:47:15.347591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347000 4755 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347717 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:31.347687397 +0000 UTC m=+1030.945619936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "metrics-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347861 4755 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.347972 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs podName:42c9c167-c386-4d60-868c-8b0b63fccbcd nodeName:}" failed. No retries permitted until 2026-03-20 13:47:31.347942303 +0000 UTC m=+1030.945874842 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs") pod "openstack-operator-controller-manager-795d5ff795-ld7m6" (UID: "42c9c167-c386-4d60-868c-8b0b63fccbcd") : secret "webhook-server-cert" not found Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.887598 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.887967 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b8b2z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-2xgt8_openstack-operators(72ea5d65-7221-4b25-9025-7a5c31bae331): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:15 crc kubenswrapper[4755]: E0320 13:47:15.889144 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podUID="72ea5d65-7221-4b25-9025-7a5c31bae331" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.465330 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.465563 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qg89d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-f2nbs_openstack-operators(21c9358d-2c84-4c38-9c91-8ca3dad4dab7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.466817 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podUID="21c9358d-2c84-4c38-9c91-8ca3dad4dab7" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.640352 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podUID="72ea5d65-7221-4b25-9025-7a5c31bae331" Mar 20 13:47:16 crc kubenswrapper[4755]: E0320 13:47:16.641419 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podUID="21c9358d-2c84-4c38-9c91-8ca3dad4dab7" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.092205 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.092700 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mb8qk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-c8crg_openstack-operators(1aaef0d5-16fe-4c61-82d5-660f29168171): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.094007 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podUID="1aaef0d5-16fe-4c61-82d5-660f29168171" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.563351 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.563558 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wz6tz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-4khh5_openstack-operators(26dfba7a-f5fa-45bc-a187-91ddce4da2d6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.564745 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podUID="26dfba7a-f5fa-45bc-a187-91ddce4da2d6" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.648032 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podUID="26dfba7a-f5fa-45bc-a187-91ddce4da2d6" Mar 20 13:47:17 crc kubenswrapper[4755]: E0320 13:47:17.651357 4755 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podUID="1aaef0d5-16fe-4c61-82d5-660f29168171" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.057830 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.058088 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fz62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-j7qf2_openstack-operators(a1b32bae-fa65-45aa-a8db-b46a7351ee2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.059290 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podUID="a1b32bae-fa65-45aa-a8db-b46a7351ee2c" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.668118 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podUID="a1b32bae-fa65-45aa-a8db-b46a7351ee2c" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.774277 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.774494 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zh9w7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-z9px7_openstack-operators(88862bd4-c890-447c-b4ee-b9cb1a4928e8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:19 crc kubenswrapper[4755]: E0320 13:47:19.775676 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podUID="88862bd4-c890-447c-b4ee-b9cb1a4928e8" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.674395 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podUID="88862bd4-c890-447c-b4ee-b9cb1a4928e8" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.972852 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.973093 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kdzd7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-4x5nd_openstack-operators(5a83ca27-3334-4aac-9129-5635d3af0714): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:47:20 crc kubenswrapper[4755]: E0320 13:47:20.974339 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podUID="5a83ca27-3334-4aac-9129-5635d3af0714" Mar 20 13:47:21 crc kubenswrapper[4755]: E0320 13:47:21.683708 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podUID="5a83ca27-3334-4aac-9129-5635d3af0714" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.702278 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" event={"ID":"bc93761d-ecc1-4179-8287-40fd76ba5ad1","Type":"ContainerStarted","Data":"51f77d82208f79ab9b94d6de5a02f4516479d986ab94fd2d8e56200808a34d71"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.702975 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.708963 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" event={"ID":"14b4b9ba-026c-4fd7-a57d-545e62b6981e","Type":"ContainerStarted","Data":"13ec6aa6d4ffdedffdb707080bbc0f6583b0cab4a5436d93282645c138d43db0"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.709305 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.715226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" event={"ID":"52210224-8989-4e16-8fdf-4ea3a8211b10","Type":"ContainerStarted","Data":"b1783584712b3e83e4d823314ce43ccfb4ea143bf780290131daa70b1304a4fd"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.716132 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.718246 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" event={"ID":"fe4ddc70-f382-4b32-8879-122023b45438","Type":"ContainerStarted","Data":"6831d686167bae994649acf600a2c17990e095119528af9e7ccbc52400b4d021"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.719249 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.728769 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" event={"ID":"b3c037b9-79d2-45ea-9b92-66e50eb20e6b","Type":"ContainerStarted","Data":"0955ee0cf5b2c84c17e46b7dad60c95b464ad8c066a3c63ba1c0bd2926469eee"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.729403 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.734494 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" podStartSLOduration=2.995093318 podStartE2EDuration="24.734480305s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.441952431 +0000 UTC m=+1000.039884960" lastFinishedPulling="2026-03-20 13:47:22.181339408 +0000 UTC m=+1021.779271947" observedRunningTime="2026-03-20 13:47:22.729409918 +0000 UTC m=+1022.327342457" watchObservedRunningTime="2026-03-20 13:47:22.734480305 +0000 UTC m=+1022.332412834" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.745241 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.755743 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" podStartSLOduration=2.7728989029999997 podStartE2EDuration="24.755709043s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.12288109 +0000 UTC m=+999.720813619" lastFinishedPulling="2026-03-20 13:47:22.10569123 +0000 UTC m=+1021.703623759" observedRunningTime="2026-03-20 13:47:22.750419739 +0000 UTC m=+1022.348352268" watchObservedRunningTime="2026-03-20 13:47:22.755709043 +0000 UTC m=+1022.353641572" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.762300 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" event={"ID":"7f51051e-6a90-4582-a411-28a106c37118","Type":"ContainerStarted","Data":"d8ee2702de03b4e28f3a6252433fb6aa3bbe6e0cee5e292c6d0831cbacbeed8d"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.762999 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.772999 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" podStartSLOduration=4.083491819 podStartE2EDuration="24.772985493s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.263612529 +0000 UTC m=+999.861545058" lastFinishedPulling="2026-03-20 13:47:20.953106203 +0000 UTC m=+1020.551038732" observedRunningTime="2026-03-20 13:47:22.76956498 +0000 UTC m=+1022.367497509" watchObservedRunningTime="2026-03-20 13:47:22.772985493 +0000 UTC m=+1022.370918022" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.775724 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" event={"ID":"00fc80a4-4ea8-4f61-8795-6473f0adc40a","Type":"ContainerStarted","Data":"b26ea63fb4e3917cc230ce1f459655861bf280c4e156c1b0785ce4f51952f639"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.776764 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.787882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" event={"ID":"bc80030a-428b-4643-9d8d-2b0e9c873060","Type":"ContainerStarted","Data":"7060ec59cc1e97cb50ff8fab5f93ce9d52d1cf68fb1fe5ba256f5ef9dd6ffed5"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.788773 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.799162 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" event={"ID":"552e0390-e86e-4972-bf6f-a4570e6b6f81","Type":"ContainerStarted","Data":"50ed27fc586ae65beb3715305a9bd491d09e4dff9a3971affeb8b32ab377236e"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.800450 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.810794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" event={"ID":"3a22a8d8-92cd-4177-a597-9c659673392c","Type":"ContainerStarted","Data":"4eb791f1d0a17ac4767e8a95fb5074e97da3c08cf17981e4a762f607691726b9"} Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.811623 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.820398 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" podStartSLOduration=2.932545337 podStartE2EDuration="24.820369662s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.619900006 +0000 UTC m=+999.217832535" lastFinishedPulling="2026-03-20 13:47:21.507724321 +0000 UTC m=+1021.105656860" observedRunningTime="2026-03-20 13:47:22.802711162 +0000 UTC m=+1022.400643691" watchObservedRunningTime="2026-03-20 13:47:22.820369662 +0000 UTC m=+1022.418302201" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.836034 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" podStartSLOduration=3.000441984 podStartE2EDuration="24.836004238s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.446575336 +0000 UTC m=+1000.044507865" lastFinishedPulling="2026-03-20 13:47:22.28213759 +0000 UTC m=+1021.880070119" observedRunningTime="2026-03-20 13:47:22.831906266 +0000 UTC m=+1022.429838795" watchObservedRunningTime="2026-03-20 13:47:22.836004238 +0000 UTC m=+1022.433936767" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.870096 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" podStartSLOduration=2.749760195 podStartE2EDuration="24.870070565s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.985339128 +0000 UTC m=+999.583271657" lastFinishedPulling="2026-03-20 13:47:22.105649488 +0000 UTC m=+1021.703582027" observedRunningTime="2026-03-20 13:47:22.865573132 +0000 UTC 
m=+1022.463505661" watchObservedRunningTime="2026-03-20 13:47:22.870070565 +0000 UTC m=+1022.468003094" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.904358 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" podStartSLOduration=2.26292405 podStartE2EDuration="24.904332107s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.464295153 +0000 UTC m=+999.062227682" lastFinishedPulling="2026-03-20 13:47:22.10570321 +0000 UTC m=+1021.703635739" observedRunningTime="2026-03-20 13:47:22.900150672 +0000 UTC m=+1022.498083201" watchObservedRunningTime="2026-03-20 13:47:22.904332107 +0000 UTC m=+1022.502264636" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.936235 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" podStartSLOduration=3.648512985 podStartE2EDuration="24.936213764s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.220055273 +0000 UTC m=+999.817987802" lastFinishedPulling="2026-03-20 13:47:21.507756052 +0000 UTC m=+1021.105688581" observedRunningTime="2026-03-20 13:47:22.934503478 +0000 UTC m=+1022.532435997" watchObservedRunningTime="2026-03-20 13:47:22.936213764 +0000 UTC m=+1022.534146293" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.965167 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" podStartSLOduration=3.571256584 podStartE2EDuration="24.965145131s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.113972068 +0000 UTC m=+999.711904597" lastFinishedPulling="2026-03-20 13:47:21.507860605 +0000 UTC m=+1021.105793144" observedRunningTime="2026-03-20 13:47:22.96214332 +0000 UTC m=+1022.560075849" 
watchObservedRunningTime="2026-03-20 13:47:22.965145131 +0000 UTC m=+1022.563077660" Mar 20 13:47:22 crc kubenswrapper[4755]: I0320 13:47:22.988829 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" podStartSLOduration=3.398534864 podStartE2EDuration="24.988805145s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.916911706 +0000 UTC m=+999.514844235" lastFinishedPulling="2026-03-20 13:47:21.507181977 +0000 UTC m=+1021.105114516" observedRunningTime="2026-03-20 13:47:22.985465684 +0000 UTC m=+1022.583398213" watchObservedRunningTime="2026-03-20 13:47:22.988805145 +0000 UTC m=+1022.586737674" Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.027034 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" podStartSLOduration=2.312835588 podStartE2EDuration="25.027012644s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.391509193 +0000 UTC m=+998.989441722" lastFinishedPulling="2026-03-20 13:47:22.105686249 +0000 UTC m=+1021.703618778" observedRunningTime="2026-03-20 13:47:23.021252848 +0000 UTC m=+1022.619185377" watchObservedRunningTime="2026-03-20 13:47:23.027012644 +0000 UTC m=+1022.624945173" Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.822542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" event={"ID":"4c1ba89a-aed6-4245-8411-4d1fecac2500","Type":"ContainerStarted","Data":"e5f4e30e876b9cc4e7fceeaf44b19e4be593da0a7bcdb75531556284eec32389"} Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.824686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" 
event={"ID":"358d4809-db3b-4468-8c8c-4ffbedc0ec89","Type":"ContainerStarted","Data":"04b4df9a34323cfb67606ee38e56a0376b5b92c846b8222838e32d1b05b256c5"} Mar 20 13:47:23 crc kubenswrapper[4755]: I0320 13:47:23.854030 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" podStartSLOduration=4.041442756 podStartE2EDuration="25.853998603s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.36657469 +0000 UTC m=+999.964507219" lastFinishedPulling="2026-03-20 13:47:22.179130537 +0000 UTC m=+1021.777063066" observedRunningTime="2026-03-20 13:47:23.847808744 +0000 UTC m=+1023.445741293" watchObservedRunningTime="2026-03-20 13:47:23.853998603 +0000 UTC m=+1023.451931132" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.689714 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-ds4tb" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.707957 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pr9d5" Mar 20 13:47:28 crc kubenswrapper[4755]: I0320 13:47:28.865952 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-hxmnd" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.026385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-xd8mk" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.048867 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-cbj27" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.076568 4755 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92fwj" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.145582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gszjd" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.149215 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-2d9hb" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.220721 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.222569 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-r8jn7" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.260989 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-c5nbs" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.335271 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-bfh6x" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.454528 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-hpmzq" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.877688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" event={"ID":"72ea5d65-7221-4b25-9025-7a5c31bae331","Type":"ContainerStarted","Data":"834f8a22d24478ac5b10d12b1369e14e15adff26e22af3f9492365ffb6d826f9"} Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 
13:47:29.878092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:47:29 crc kubenswrapper[4755]: I0320 13:47:29.896798 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" podStartSLOduration=3.58105853 podStartE2EDuration="31.896769348s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.356161846 +0000 UTC m=+999.954094375" lastFinishedPulling="2026-03-20 13:47:28.671872664 +0000 UTC m=+1028.269805193" observedRunningTime="2026-03-20 13:47:29.895571036 +0000 UTC m=+1029.493503595" watchObservedRunningTime="2026-03-20 13:47:29.896769348 +0000 UTC m=+1029.494701897" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.486107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.499646 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/83d6120d-b54b-452c-aa8a-026665f1afae-cert\") pod \"infra-operator-controller-manager-7b55fff5bb-sm4wg\" (UID: \"83d6120d-b54b-452c-aa8a-026665f1afae\") " pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.667848 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-l7q5f" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.676392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.896718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:30 crc kubenswrapper[4755]: I0320 13:47:30.905107 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bad91c65-94da-4f8a-addb-21b037197217-cert\") pod \"openstack-baremetal-operator-controller-manager-86657c54f58nx7v\" (UID: \"bad91c65-94da-4f8a-addb-21b037197217\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.069831 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg"] Mar 20 13:47:31 crc kubenswrapper[4755]: W0320 13:47:31.080717 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83d6120d_b54b_452c_aa8a_026665f1afae.slice/crio-3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb WatchSource:0}: Error finding container 3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb: Status 404 returned error can't find the container with id 3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.125868 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-n4kd2" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 
13:47:31.129935 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.406931 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.407438 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.413217 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-metrics-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.413598 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/42c9c167-c386-4d60-868c-8b0b63fccbcd-webhook-certs\") pod \"openstack-operator-controller-manager-795d5ff795-ld7m6\" (UID: \"42c9c167-c386-4d60-868c-8b0b63fccbcd\") " pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 
13:47:31.567214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v"] Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.615484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-7dlnc" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.622352 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.908504 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6"] Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.914688 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" event={"ID":"bad91c65-94da-4f8a-addb-21b037197217","Type":"ContainerStarted","Data":"3d81e557f89e6ec8ca35b9228ca4108698597194ed24f29a155f62a42056e450"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.915809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" event={"ID":"1aaef0d5-16fe-4c61-82d5-660f29168171","Type":"ContainerStarted","Data":"b0c9a875a201c24b71b411179f6292b2276738811db10345de6d877b34630120"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.916752 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.920112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" 
event={"ID":"83d6120d-b54b-452c-aa8a-026665f1afae","Type":"ContainerStarted","Data":"3fc93e3b72537e6397ef482114b15b83f5341ac7e4754e85668c935bcbe574cb"} Mar 20 13:47:31 crc kubenswrapper[4755]: I0320 13:47:31.936259 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" podStartSLOduration=2.536412381 podStartE2EDuration="33.936237142s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.285293429 +0000 UTC m=+999.883225958" lastFinishedPulling="2026-03-20 13:47:31.68511819 +0000 UTC m=+1031.283050719" observedRunningTime="2026-03-20 13:47:31.93063335 +0000 UTC m=+1031.528565869" watchObservedRunningTime="2026-03-20 13:47:31.936237142 +0000 UTC m=+1031.534169671" Mar 20 13:47:32 crc kubenswrapper[4755]: W0320 13:47:32.615917 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42c9c167_c386_4d60_868c_8b0b63fccbcd.slice/crio-667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211 WatchSource:0}: Error finding container 667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211: Status 404 returned error can't find the container with id 667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211 Mar 20 13:47:32 crc kubenswrapper[4755]: I0320 13:47:32.933465 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" event={"ID":"42c9c167-c386-4d60-868c-8b0b63fccbcd","Type":"ContainerStarted","Data":"667218bca55aea3e7c27093f3f8508420ebf4b7abeb4b3a54a4691901a4f1211"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.944579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" 
event={"ID":"42c9c167-c386-4d60-868c-8b0b63fccbcd","Type":"ContainerStarted","Data":"d3610573dcd7197c2967de8d19220fa13ea282ce0acaa0b9586acababdccc496"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.945025 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.947246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" event={"ID":"21c9358d-2c84-4c38-9c91-8ca3dad4dab7","Type":"ContainerStarted","Data":"db826354ebdf6b02c5f701b694b96a03663af0434e028a4e6bac7728a9c56c64"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.947512 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.949966 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" event={"ID":"83d6120d-b54b-452c-aa8a-026665f1afae","Type":"ContainerStarted","Data":"6272366f186751bdf2848316a190c168b3e037536e78d9828e4fb8fc3b6574a8"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.950165 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.952723 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" event={"ID":"26dfba7a-f5fa-45bc-a187-91ddce4da2d6","Type":"ContainerStarted","Data":"d572a9132b41c964880fce2fe841c76c8023675373e0cbe6da6efb6508fe51dd"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.953381 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.955695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" event={"ID":"5a83ca27-3334-4aac-9129-5635d3af0714","Type":"ContainerStarted","Data":"09b0ae48ccfbe35e11690a2786dc6c6457b191d49902845e8b4f81381ca43a9b"} Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.956110 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:47:33 crc kubenswrapper[4755]: I0320 13:47:33.997749 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" podStartSLOduration=1.9098457450000002 podStartE2EDuration="35.997730986s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.619526796 +0000 UTC m=+999.217459315" lastFinishedPulling="2026-03-20 13:47:33.707412027 +0000 UTC m=+1033.305344556" observedRunningTime="2026-03-20 13:47:33.996302116 +0000 UTC m=+1033.594234655" watchObservedRunningTime="2026-03-20 13:47:33.997730986 +0000 UTC m=+1033.595663515" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.002100 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" podStartSLOduration=35.002090544 podStartE2EDuration="35.002090544s" podCreationTimestamp="2026-03-20 13:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:33.980806975 +0000 UTC m=+1033.578739534" watchObservedRunningTime="2026-03-20 13:47:34.002090544 +0000 UTC m=+1033.600023073" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.034023 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" podStartSLOduration=34.308931612 podStartE2EDuration="36.034007643s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:31.083270177 +0000 UTC m=+1030.681202706" lastFinishedPulling="2026-03-20 13:47:32.808346208 +0000 UTC m=+1032.406278737" observedRunningTime="2026-03-20 13:47:34.030617961 +0000 UTC m=+1033.628550500" watchObservedRunningTime="2026-03-20 13:47:34.034007643 +0000 UTC m=+1033.631940172" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.061047 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" podStartSLOduration=3.694472676 podStartE2EDuration="36.061031038s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.441898779 +0000 UTC m=+1000.039831308" lastFinishedPulling="2026-03-20 13:47:32.808457141 +0000 UTC m=+1032.406389670" observedRunningTime="2026-03-20 13:47:34.060632436 +0000 UTC m=+1033.658564965" watchObservedRunningTime="2026-03-20 13:47:34.061031038 +0000 UTC m=+1033.658963567" Mar 20 13:47:34 crc kubenswrapper[4755]: I0320 13:47:34.084546 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" podStartSLOduration=2.813726665 podStartE2EDuration="36.084518137s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.538942354 +0000 UTC m=+999.136874883" lastFinishedPulling="2026-03-20 13:47:32.809733826 +0000 UTC m=+1032.407666355" observedRunningTime="2026-03-20 13:47:34.080274881 +0000 UTC m=+1033.678207410" watchObservedRunningTime="2026-03-20 13:47:34.084518137 +0000 UTC m=+1033.682450666" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.971253 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" event={"ID":"88862bd4-c890-447c-b4ee-b9cb1a4928e8","Type":"ContainerStarted","Data":"8d5fc94f6b72f307ce4f9c317f2c9532211b1207a5a58aca33c0afe897df4dc6"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.971844 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.973858 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" event={"ID":"bad91c65-94da-4f8a-addb-21b037197217","Type":"ContainerStarted","Data":"ab8f9c7ca3ee651697a5f40226da3f251c2ca377bfe8241e9903e1197190d33a"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.973941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.976118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" event={"ID":"a1b32bae-fa65-45aa-a8db-b46a7351ee2c","Type":"ContainerStarted","Data":"9dfb913aac9e7cd5ba15e0dc7557d209740f1531aaa5fee49dfa2dda0a6c439f"} Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.976298 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:47:35 crc kubenswrapper[4755]: I0320 13:47:35.998752 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" podStartSLOduration=3.527684228 podStartE2EDuration="37.998722703s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 
13:47:00.62167331 +0000 UTC m=+1000.219605839" lastFinishedPulling="2026-03-20 13:47:35.092711785 +0000 UTC m=+1034.690644314" observedRunningTime="2026-03-20 13:47:35.994544144 +0000 UTC m=+1035.592476693" watchObservedRunningTime="2026-03-20 13:47:35.998722703 +0000 UTC m=+1035.596655262" Mar 20 13:47:36 crc kubenswrapper[4755]: I0320 13:47:36.040761 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" podStartSLOduration=34.532481679 podStartE2EDuration="38.040734274s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:31.583344661 +0000 UTC m=+1031.181277190" lastFinishedPulling="2026-03-20 13:47:35.091597256 +0000 UTC m=+1034.689529785" observedRunningTime="2026-03-20 13:47:36.035944929 +0000 UTC m=+1035.633877498" watchObservedRunningTime="2026-03-20 13:47:36.040734274 +0000 UTC m=+1035.638666813" Mar 20 13:47:36 crc kubenswrapper[4755]: I0320 13:47:36.059973 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" podStartSLOduration=3.327256775 podStartE2EDuration="38.059944557s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.361306747 +0000 UTC m=+999.959239276" lastFinishedPulling="2026-03-20 13:47:35.093994529 +0000 UTC m=+1034.691927058" observedRunningTime="2026-03-20 13:47:36.057877963 +0000 UTC m=+1035.655810502" watchObservedRunningTime="2026-03-20 13:47:36.059944557 +0000 UTC m=+1035.657877116" Mar 20 13:47:38 crc kubenswrapper[4755]: I0320 13:47:38.960751 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-f2nbs" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.018558 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-4x5nd" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.167031 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-c8crg" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.299952 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2xgt8" Mar 20 13:47:39 crc kubenswrapper[4755]: I0320 13:47:39.618944 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-4khh5" Mar 20 13:47:40 crc kubenswrapper[4755]: I0320 13:47:40.686839 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b55fff5bb-sm4wg" Mar 20 13:47:41 crc kubenswrapper[4755]: I0320 13:47:41.141958 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-86657c54f58nx7v" Mar 20 13:47:41 crc kubenswrapper[4755]: I0320 13:47:41.638018 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-795d5ff795-ld7m6" Mar 20 13:47:49 crc kubenswrapper[4755]: I0320 13:47:49.357919 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-j7qf2" Mar 20 13:47:49 crc kubenswrapper[4755]: I0320 13:47:49.800170 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-z9px7" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.148482 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:00 
crc kubenswrapper[4755]: I0320 13:48:00.149862 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.153460 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.153873 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.154161 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.167636 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.261112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.362811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.391434 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") 
pod \"auto-csr-approver-29566908-dmw6j\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.498508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:00 crc kubenswrapper[4755]: I0320 13:48:00.996069 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:48:01 crc kubenswrapper[4755]: W0320 13:48:01.006367 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda434c164_9ea6_4062_b8f6_88bb58f41a64.slice/crio-de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6 WatchSource:0}: Error finding container de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6: Status 404 returned error can't find the container with id de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6 Mar 20 13:48:01 crc kubenswrapper[4755]: I0320 13:48:01.202358 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerStarted","Data":"de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6"} Mar 20 13:48:04 crc kubenswrapper[4755]: I0320 13:48:04.221774 4755 generic.go:334] "Generic (PLEG): container finished" podID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerID="52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7" exitCode=0 Mar 20 13:48:04 crc kubenswrapper[4755]: I0320 13:48:04.221860 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerDied","Data":"52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7"} Mar 20 13:48:05 crc kubenswrapper[4755]: 
I0320 13:48:05.575231 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.651460 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") pod \"a434c164-9ea6-4062-b8f6-88bb58f41a64\" (UID: \"a434c164-9ea6-4062-b8f6-88bb58f41a64\") " Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.658121 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb" (OuterVolumeSpecName: "kube-api-access-rqkwb") pod "a434c164-9ea6-4062-b8f6-88bb58f41a64" (UID: "a434c164-9ea6-4062-b8f6-88bb58f41a64"). InnerVolumeSpecName "kube-api-access-rqkwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:05 crc kubenswrapper[4755]: I0320 13:48:05.753499 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqkwb\" (UniqueName: \"kubernetes.io/projected/a434c164-9ea6-4062-b8f6-88bb58f41a64-kube-api-access-rqkwb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.248961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" event={"ID":"a434c164-9ea6-4062-b8f6-88bb58f41a64","Type":"ContainerDied","Data":"de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6"} Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.249001 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de092be4cc8126721f3e099657e9c6b8b66aae613026c0264f73a0796da377a6" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.249005 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-dmw6j" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.670727 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.675866 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-zh8v6"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836440 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:06 crc kubenswrapper[4755]: E0320 13:48:06.836789 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836808 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.836931 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" containerName="oc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.837634 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-446mv" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839684 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839812 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.839993 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.846411 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.896554 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.898180 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.900756 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.906771 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.973620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974044 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974172 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:06 crc kubenswrapper[4755]: I0320 13:48:06.974486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076514 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076770 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.076967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: 
I0320 13:48:07.077173 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.077616 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.078023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.078332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.101489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"dnsmasq-dns-675f4bcbfc-j9fdk\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.105034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"dnsmasq-dns-78dd6ddcc-qb4hg\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") " pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.151737 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.212225 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.247089 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="320783b7-7554-4157-b6cd-143d787dc30b" path="/var/lib/kubelet/pods/320783b7-7554-4157-b6cd-143d787dc30b/volumes" Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.486380 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:07 crc kubenswrapper[4755]: I0320 13:48:07.527145 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:07 crc kubenswrapper[4755]: W0320 13:48:07.536804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31232794_c643_4a0d_a32c_9bcd76b1e121.slice/crio-be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d WatchSource:0}: Error finding container be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d: Status 404 returned error can't find the container with id be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d Mar 20 13:48:08 crc kubenswrapper[4755]: I0320 13:48:08.278093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" 
event={"ID":"126a2da5-9f66-4125-9d2a-424cbc297bfd","Type":"ContainerStarted","Data":"6fb1b92200cdb6cb763de2b6dc5645838d502f87f1501b1c535570e9f51eb04d"} Mar 20 13:48:08 crc kubenswrapper[4755]: I0320 13:48:08.280851 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" event={"ID":"31232794-c643-4a0d-a32c-9bcd76b1e121","Type":"ContainerStarted","Data":"be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d"} Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.513831 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.536224 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.542863 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.546638 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653636 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.653806 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.755810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.755941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.756188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.757101 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.757188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.787226 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"dnsmasq-dns-5ccc8479f9-f44rr\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.882870 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:09 crc kubenswrapper[4755]: I0320 13:48:09.950710 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.016849 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.024297 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.027437 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.078812 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.078926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.079029 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181325 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181882 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.181947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.182636 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.182941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.204821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"dnsmasq-dns-57d769cc4f-8qk87\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.347058 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.474146 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:10 crc kubenswrapper[4755]: W0320 13:48:10.490187 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d52487_e77f_403a_a60e_af716068e035.slice/crio-17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b WatchSource:0}: Error finding container 17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b: Status 404 returned error can't find the container with id 17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.522545 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.531392 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.536912 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537156 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537312 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537407 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537586 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.537737 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.540233 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dj4wr" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.548134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.603512 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691123 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691186 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691763 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691813 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691858 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.691923 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798947 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.798992 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799018 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799059 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799163 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.799931 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.803491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.803868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.804383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.806194 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/6d21386c-8267-4dba-9028-d5cb729ff78b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.811396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.813411 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6d21386c-8267-4dba-9028-d5cb729ff78b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.815274 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6d21386c-8267-4dba-9028-d5cb729ff78b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.817218 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.820255 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.823793 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bt79h\" (UniqueName: \"kubernetes.io/projected/6d21386c-8267-4dba-9028-d5cb729ff78b-kube-api-access-bt79h\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.848140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"6d21386c-8267-4dba-9028-d5cb729ff78b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.864040 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.924305 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.928929 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931221 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931388 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931581 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931764 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-nsxpj" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931790 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.931806 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.932455 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:48:10 crc kubenswrapper[4755]: I0320 13:48:10.940050 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104449 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104492 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104517 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104538 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104566 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0" Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104706 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.104731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.105032 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.105104 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.208872 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" 
(UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.208954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209078 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209114 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209140 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209181 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209248 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.209314 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.210153 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.210717 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.212127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-server-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.212140 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-config-data\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.213356 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.214249 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 
13:48:11.216762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.227926 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.230232 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.233177 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-pod-info\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.234721 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrr2\" (UniqueName: \"kubernetes.io/projected/c2ca344f-8f18-4dd9-9e5c-44669ff2da4f-kube-api-access-8rrr2\") pod \"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.265005 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod 
\"rabbitmq-server-0\" (UID: \"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.286514 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.317836 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerStarted","Data":"17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b"}
Mar 20 13:48:11 crc kubenswrapper[4755]: I0320 13:48:11.319378 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerStarted","Data":"b08752445d4cb4a4b8b6c2c978645cf8d6b89df6eef356585e9cb68a217e3d17"}
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.481339 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.483279 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.487593 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.487714 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.489391 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.491123 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.500593 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-4drl9"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.506884 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641110 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" 
(UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641174 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82c5p\" (UniqueName: \"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641213 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.641501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.742942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743054 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743403 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") device mount path 
\"/mnt/openstack/pv12\"" pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743487 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-generated\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743267 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743846 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82c5p\" (UniqueName: \"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743900 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.743941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.744230 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-kolla-config\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.744966 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-operator-scripts\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.747256 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/23ab8e52-0cde-43ec-af8d-24f794695200-config-data-default\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.750355 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.751151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23ab8e52-0cde-43ec-af8d-24f794695200-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.764327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82c5p\" (UniqueName: 
\"kubernetes.io/projected/23ab8e52-0cde-43ec-af8d-24f794695200-kube-api-access-82c5p\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.780141 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"openstack-galera-0\" (UID: \"23ab8e52-0cde-43ec-af8d-24f794695200\") " pod="openstack/openstack-galera-0"
Mar 20 13:48:12 crc kubenswrapper[4755]: I0320 13:48:12.819955 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.805271 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.807814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.814633 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-v75sj"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817426 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.817445 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.830142 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968046 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968231 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968273 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.968787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc 
kubenswrapper[4755]: I0320 13:48:13.968940 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.969074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:13 crc kubenswrapper[4755]: I0320 13:48:13.969346 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071464 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 
13:48:14.071588 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071693 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071725 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.071824 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.072378 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.073847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.074277 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.097192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.100076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vcd6\" (UniqueName: \"kubernetes.io/projected/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-kube-api-access-2vcd6\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.102374 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.126118 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.134932 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.136900 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139275 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-bw6d5" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139369 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.139398 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.147085 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe\") " pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276382 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.276750 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379000 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379074 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379137 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: 
\"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.379236 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380245 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-kolla-config\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.380192 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1786d302-95f2-410e-8280-14a89cbaf48c-config-data\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.383530 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-combined-ca-bundle\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.389146 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1786d302-95f2-410e-8280-14a89cbaf48c-memcached-tls-certs\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.399263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrxpz\" (UniqueName: \"kubernetes.io/projected/1786d302-95f2-410e-8280-14a89cbaf48c-kube-api-access-zrxpz\") pod \"memcached-0\" (UID: \"1786d302-95f2-410e-8280-14a89cbaf48c\") " pod="openstack/memcached-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.456193 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:14 crc kubenswrapper[4755]: I0320 13:48:14.519844 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.463923 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.465293 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.468694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-sjqzk" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.473921 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.623294 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.724839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.746581 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"kube-state-metrics-0\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " pod="openstack/kube-state-metrics-0" Mar 20 13:48:16 crc kubenswrapper[4755]: I0320 13:48:16.789303 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.454328 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kbcdp"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.455556 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461332 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461494 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-cjqnm" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.461614 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.478213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.490386 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-wbxnd"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.491895 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.514802 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wbxnd"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570555 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570576 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570909 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.570930 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdlgt\" (UniqueName: \"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672774 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdlgt\" (UniqueName: 
\"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672883 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672925 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.672989 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: 
\"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673048 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673107 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673166 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673207 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673864 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-run\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.673940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-log\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-etc-ovs\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b2961ad5-0d2c-46e9-bb50-2e2893353945-var-lib\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.674369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run-ovn\") pod 
\"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.676603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-run\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.676862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/408d869f-0966-4908-88e5-37cdff345c4a-var-log-ovn\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.677677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b2961ad5-0d2c-46e9-bb50-2e2893353945-scripts\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.678382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/408d869f-0966-4908-88e5-37cdff345c4a-scripts\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.684778 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-ovn-controller-tls-certs\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.686507 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/408d869f-0966-4908-88e5-37cdff345c4a-combined-ca-bundle\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.691942 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdlgt\" (UniqueName: \"kubernetes.io/projected/b2961ad5-0d2c-46e9-bb50-2e2893353945-kube-api-access-wdlgt\") pod \"ovn-controller-ovs-wbxnd\" (UID: \"b2961ad5-0d2c-46e9-bb50-2e2893353945\") " pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.694076 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkhh7\" (UniqueName: \"kubernetes.io/projected/408d869f-0966-4908-88e5-37cdff345c4a-kube-api-access-wkhh7\") pod \"ovn-controller-kbcdp\" (UID: \"408d869f-0966-4908-88e5-37cdff345c4a\") " pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.779448 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.798095 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.799495 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803181 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803273 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.803431 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-l8pkm" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.809373 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.832916 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876975 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.876994 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bggt8\" (UniqueName: 
\"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.877148 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978512 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978586 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod 
\"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bggt8\" (UniqueName: \"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978731 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978760 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.978804 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979258 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.979936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.980151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-config\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.982567 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.984044 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.984728 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:19 crc kubenswrapper[4755]: I0320 13:48:19.997438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bggt8\" (UniqueName: \"kubernetes.io/projected/fed1ecda-4acb-4a4c-a84e-12e58b3ad243-kube-api-access-bggt8\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:20 crc kubenswrapper[4755]: I0320 13:48:19.999836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-sb-0\" (UID: \"fed1ecda-4acb-4a4c-a84e-12e58b3ad243\") " pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:20 crc kubenswrapper[4755]: I0320 13:48:20.128607 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 20 13:48:22 crc kubenswrapper[4755]: I0320 13:48:22.594034 4755 scope.go:117] "RemoveContainer" containerID="dfdbdcc4af0ec9266671d4add4df7a76f9886a34d955867696c6f66357f812ac"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.071887 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.074053 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076155 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076189 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.076920 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-dq7gm"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.079210 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.091068 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175276 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.175360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.176074 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.176115 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277527 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277584 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277648 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277697 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277722 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277742 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.277796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.278696 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-config\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.280793 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.280996 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.282051 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/de877bb8-b1cd-45de-94c1-5242659fd03e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.285278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.285718 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.291357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/de877bb8-b1cd-45de-94c1-5242659fd03e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.299836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.312038 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gtlq\" (UniqueName: \"kubernetes.io/projected/de877bb8-b1cd-45de-94c1-5242659fd03e-kube-api-access-5gtlq\") pod \"ovsdbserver-nb-0\" (UID: \"de877bb8-b1cd-45de-94c1-5242659fd03e\") " pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:23 crc kubenswrapper[4755]: I0320 13:48:23.399984 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.658599 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.659096 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ndt9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-qb4hg_openstack(31232794-c643-4a0d-a32c-9bcd76b1e121): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.660622 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" podUID="31232794-c643-4a0d-a32c-9bcd76b1e121"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.685311 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.685490 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4tm97,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-j9fdk_openstack(126a2da5-9f66-4125-9d2a-424cbc297bfd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:48:24 crc kubenswrapper[4755]: E0320 13:48:24.686963 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" podUID="126a2da5-9f66-4125-9d2a-424cbc297bfd"
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.363806 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.366744 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23ab8e52_0cde_43ec_af8d_24f794695200.slice/crio-e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e WatchSource:0}: Error finding container e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e: Status 404 returned error can't find the container with id e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e
Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.376900 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d21386c_8267_4dba_9028_d5cb729ff78b.slice/crio-d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61 WatchSource:0}: Error finding container d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61: Status 404 returned error can't find the container with id d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.377981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.431623 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"e8f413c046f2f1a22bed1b8fa5ddbb2f0a3f5b44e8a8fc18f0d76d96b6c91e1e"}
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.432673 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"d5654cf1527d32292935383b38c42536e27c474bb4c54a2839f0bfd67bdd1e61"}
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.434954 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8d52487-e77f-403a-a60e-af716068e035" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" exitCode=0
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.435025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a"}
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.438683 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerID="3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4" exitCode=0
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.439436 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4"}
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.528731 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.534675 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.540501 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1786d302_95f2_410e_8280_14a89cbaf48c.slice/crio-15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc WatchSource:0}: Error finding container 15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc: Status 404 returned error can't find the container with id 15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.540568 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.549423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.553804 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1ede4a6_e06a_4084_8ba6_5f1c7f838bbe.slice/crio-901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe WatchSource:0}: Error finding container 901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe: Status 404 returned error can't find the container with id 901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe
Mar 20 13:48:25 crc kubenswrapper[4755]: W0320 13:48:25.735962 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfed1ecda_4acb_4a4c_a84e_12e58b3ad243.slice/crio-755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc WatchSource:0}: Error finding container 755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc: Status 404 returned error can't find the container with id 755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.739881 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.770785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp"]
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.842299 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-wbxnd"]
Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.849610 4755 log.go:32] "CreateContainer in sandbox from runtime service failed" err=<
Mar 20 13:48:25 crc kubenswrapper[4755]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 13:48:25 crc kubenswrapper[4755]: > podSandboxID="17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b"
Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.849797 4755 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:48:25 crc kubenswrapper[4755]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7prj7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-f44rr_openstack(b8d52487-e77f-403a-a60e-af716068e035): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory
Mar 20 13:48:25 crc kubenswrapper[4755]: > logger="UnhandledError"
Mar 20 13:48:25 crc kubenswrapper[4755]: E0320 13:48:25.850854 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podUID="b8d52487-e77f-403a-a60e-af716068e035"
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.923186 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk"
Mar 20 13:48:25 crc kubenswrapper[4755]: I0320 13:48:25.942429 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg"
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031235 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") "
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031368 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") "
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031494 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") pod \"31232794-c643-4a0d-a32c-9bcd76b1e121\" (UID: \"31232794-c643-4a0d-a32c-9bcd76b1e121\") "
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031535 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") pod \"126a2da5-9f66-4125-9d2a-424cbc297bfd\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") "
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.031563 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") pod \"126a2da5-9f66-4125-9d2a-424cbc297bfd\" (UID: \"126a2da5-9f66-4125-9d2a-424cbc297bfd\") "
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032077 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config" (OuterVolumeSpecName: "config") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.032201 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config" (OuterVolumeSpecName: "config") pod "126a2da5-9f66-4125-9d2a-424cbc297bfd" (UID: "126a2da5-9f66-4125-9d2a-424cbc297bfd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.036959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97" (OuterVolumeSpecName: "kube-api-access-4tm97") pod "126a2da5-9f66-4125-9d2a-424cbc297bfd" (UID: "126a2da5-9f66-4125-9d2a-424cbc297bfd"). InnerVolumeSpecName "kube-api-access-4tm97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.037232 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d" (OuterVolumeSpecName: "kube-api-access-ndt9d") pod "31232794-c643-4a0d-a32c-9bcd76b1e121" (UID: "31232794-c643-4a0d-a32c-9bcd76b1e121"). InnerVolumeSpecName "kube-api-access-ndt9d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.133793 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134207 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tm97\" (UniqueName: \"kubernetes.io/projected/126a2da5-9f66-4125-9d2a-424cbc297bfd-kube-api-access-4tm97\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134219 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/126a2da5-9f66-4125-9d2a-424cbc297bfd-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134297 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/31232794-c643-4a0d-a32c-9bcd76b1e121-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.134308 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndt9d\" (UniqueName: \"kubernetes.io/projected/31232794-c643-4a0d-a32c-9bcd76b1e121-kube-api-access-ndt9d\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.451805 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"901c2e2411d4e6fd0e3180a9357ff2f97754ae4f25ed0be80bcfb17c48f2ebbe"}
Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.453732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"a0e5580e48257aae362f608b62e3bbc6073b0ba593958e3e0816180b07437a59"}
Mar 20
13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.455939 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"71c82861b041c19bf20384296bc113b1205d4d94c31287f2a85b68ac81cef0b4"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.457835 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"755fc833fd8683e288f2daf83c18749a5dfe6cb03aa1bbdc05b22204a0bd40bc"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.459197 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1786d302-95f2-410e-8280-14a89cbaf48c","Type":"ContainerStarted","Data":"15854cb11daaa034d0955018b8bbf8ebb55f93c02cb531e02bc584848f77cdfc"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.460788 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" event={"ID":"31232794-c643-4a0d-a32c-9bcd76b1e121","Type":"ContainerDied","Data":"be243c5902d548486653f9bbf1410fa5553b8572a55786a1b1e8fc190e579d8d"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.460827 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-qb4hg" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.464385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp" event={"ID":"408d869f-0966-4908-88e5-37cdff345c4a","Type":"ContainerStarted","Data":"814c7070b25ab1fe6ca3c907111fcae65c59840a0d99df62c010f837dc326347"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.465789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" event={"ID":"126a2da5-9f66-4125-9d2a-424cbc297bfd","Type":"ContainerDied","Data":"6fb1b92200cdb6cb763de2b6dc5645838d502f87f1501b1c535570e9f51eb04d"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.465829 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-j9fdk" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.469186 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerStarted","Data":"93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.469800 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.472154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerStarted","Data":"1f936cfbd135019d1572ee465a4fb61fade57721a1a7701a47ec15a9bf86c1cd"} Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.520712 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" podStartSLOduration=3.295801188 podStartE2EDuration="17.520690948s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" 
firstStartedPulling="2026-03-20 13:48:10.61117297 +0000 UTC m=+1070.209105499" lastFinishedPulling="2026-03-20 13:48:24.83606272 +0000 UTC m=+1084.433995259" observedRunningTime="2026-03-20 13:48:26.495103558 +0000 UTC m=+1086.093036097" watchObservedRunningTime="2026-03-20 13:48:26.520690948 +0000 UTC m=+1086.118623477" Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.554939 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.622025 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-j9fdk"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.632430 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.639370 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-qb4hg"] Mar 20 13:48:26 crc kubenswrapper[4755]: I0320 13:48:26.692508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:48:27 crc kubenswrapper[4755]: I0320 13:48:27.237317 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126a2da5-9f66-4125-9d2a-424cbc297bfd" path="/var/lib/kubelet/pods/126a2da5-9f66-4125-9d2a-424cbc297bfd/volumes" Mar 20 13:48:27 crc kubenswrapper[4755]: I0320 13:48:27.237704 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31232794-c643-4a0d-a32c-9bcd76b1e121" path="/var/lib/kubelet/pods/31232794-c643-4a0d-a32c-9bcd76b1e121/volumes" Mar 20 13:48:30 crc kubenswrapper[4755]: W0320 13:48:30.291467 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde877bb8_b1cd_45de_94c1_5242659fd03e.slice/crio-903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae WatchSource:0}: Error finding container 
903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae: Status 404 returned error can't find the container with id 903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.348876 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.400500 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:30 crc kubenswrapper[4755]: I0320 13:48:30.508884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"903573e757ec209a69f0f7dbc84bd2a5b175ca88c0b114462207d3381effdfae"} Mar 20 13:48:36 crc kubenswrapper[4755]: I0320 13:48:36.750841 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:48:36 crc kubenswrapper[4755]: I0320 13:48:36.751463 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.606883 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.614280 4755 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerStarted","Data":"8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.615082 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.621829 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"1786d302-95f2-410e-8280-14a89cbaf48c","Type":"ContainerStarted","Data":"ac8f807fc1fcca9ddc96ee0bd88bc0a23040ab4b9e40a2dd44b106733da238e6"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.622033 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633313 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerStarted","Data":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633606 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" containerID="cri-o://4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" gracePeriod=10 Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.633725 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.651965 4755 generic.go:334] "Generic (PLEG): container finished" podID="b2961ad5-0d2c-46e9-bb50-2e2893353945" containerID="bf4e9e4330683d4ccd79b8f36beff0599ba6add00c8a7a23493065253bae6878" exitCode=0 Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.652041 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerDied","Data":"bf4e9e4330683d4ccd79b8f36beff0599ba6add00c8a7a23493065253bae6878"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.658602 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp" event={"ID":"408d869f-0966-4908-88e5-37cdff345c4a","Type":"ContainerStarted","Data":"13af441eb89a42cfa2fddf8e375445e903e8202c57fe131607cbbae52bf16654"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.658703 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-kbcdp" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.671756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"af947e5dc0897b31ec19fa3b48dc856fb157faf9cb5872ca5f78522c8e383953"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.700288 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.812029022 podStartE2EDuration="27.700266002s" podCreationTimestamp="2026-03-20 13:48:14 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.546853262 +0000 UTC m=+1085.144785791" lastFinishedPulling="2026-03-20 13:48:37.435090242 +0000 UTC m=+1097.033022771" observedRunningTime="2026-03-20 13:48:41.663984103 +0000 UTC m=+1101.261916632" watchObservedRunningTime="2026-03-20 13:48:41.700266002 +0000 UTC m=+1101.298198531" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.708749 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.686443541 podStartE2EDuration="25.708722985s" podCreationTimestamp="2026-03-20 13:48:16 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.562988665 +0000 UTC 
m=+1085.160921194" lastFinishedPulling="2026-03-20 13:48:40.585268099 +0000 UTC m=+1100.183200638" observedRunningTime="2026-03-20 13:48:41.682846096 +0000 UTC m=+1101.280778625" watchObservedRunningTime="2026-03-20 13:48:41.708722985 +0000 UTC m=+1101.306655514" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.714632 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"a0dd7b7fc6c15f4c319fe9bcd82f91e32adc8dc7930b71d2b5f3803ace890852"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.722112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb"} Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.727252 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" podStartSLOduration=18.360279707 podStartE2EDuration="32.727196398s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:10.496997998 +0000 UTC m=+1070.094930527" lastFinishedPulling="2026-03-20 13:48:24.863914689 +0000 UTC m=+1084.461847218" observedRunningTime="2026-03-20 13:48:41.716234001 +0000 UTC m=+1101.314166540" watchObservedRunningTime="2026-03-20 13:48:41.727196398 +0000 UTC m=+1101.325128917" Mar 20 13:48:41 crc kubenswrapper[4755]: I0320 13:48:41.749941 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-kbcdp" podStartSLOduration=8.687077038 podStartE2EDuration="22.749917184s" podCreationTimestamp="2026-03-20 13:48:19 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.774032674 +0000 UTC m=+1085.371965223" lastFinishedPulling="2026-03-20 13:48:39.83687282 +0000 UTC m=+1099.434805369" observedRunningTime="2026-03-20 
13:48:41.739126941 +0000 UTC m=+1101.337059480" watchObservedRunningTime="2026-03-20 13:48:41.749917184 +0000 UTC m=+1101.347849713" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.185563 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.359850 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.360278 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.360369 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") pod \"b8d52487-e77f-403a-a60e-af716068e035\" (UID: \"b8d52487-e77f-403a-a60e-af716068e035\") " Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.368815 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7" (OuterVolumeSpecName: "kube-api-access-7prj7") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "kube-api-access-7prj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.401175 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config" (OuterVolumeSpecName: "config") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.402705 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b8d52487-e77f-403a-a60e-af716068e035" (UID: "b8d52487-e77f-403a-a60e-af716068e035"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462118 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7prj7\" (UniqueName: \"kubernetes.io/projected/b8d52487-e77f-403a-a60e-af716068e035-kube-api-access-7prj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462172 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.462185 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b8d52487-e77f-403a-a60e-af716068e035-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735280 4755 generic.go:334] "Generic (PLEG): container finished" podID="b8d52487-e77f-403a-a60e-af716068e035" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" exitCode=0 Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 
13:48:42.735381 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" event={"ID":"b8d52487-e77f-403a-a60e-af716068e035","Type":"ContainerDied","Data":"17b46a4ab78b734b7f716989dbf59e8c856af6fb527b2b953a9455a0fb874c6b"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735442 4755 scope.go:117] "RemoveContainer" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.735616 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-f44rr" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.749595 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.767980 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"f07f9cde3762a0eea76351152ab9e9747c414ceeede6f6c4913d32a01cfb2e75"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-wbxnd" event={"ID":"b2961ad5-0d2c-46e9-bb50-2e2893353945","Type":"ContainerStarted","Data":"d6047cc707b2bec5222eb333a26a6ecf68c2ecb99d4f55acbe0341219636bd6a"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768393 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.768462 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.780260 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.780891 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="init" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781334 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="init" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.781368 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781377 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.781635 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d52487-e77f-403a-a60e-af716068e035" containerName="dnsmasq-dns" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.783132 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.790109 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.794976 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f"} Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.802819 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.817728 4755 scope.go:117] "RemoveContainer" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.872124 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873753 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod 
\"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.873955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.874028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.878265 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.889285 4755 scope.go:117] "RemoveContainer" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.892064 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": container with ID starting with 
4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0 not found: ID does not exist" containerID="4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892101 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0"} err="failed to get container status \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": rpc error: code = NotFound desc = could not find container \"4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0\": container with ID starting with 4984cea7288a2f71966b6466955a5f83c564bde0d5ceea4d826fea6d2c0b31f0 not found: ID does not exist" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892121 4755 scope.go:117] "RemoveContainer" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: E0320 13:48:42.892490 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": container with ID starting with f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a not found: ID does not exist" containerID="f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.892513 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a"} err="failed to get container status \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": rpc error: code = NotFound desc = could not find container \"f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a\": container with ID starting with f755b0a2ea7c140754e647f78fa35f1a9087993b144a9e9f5dd22a29509f3b4a not found: ID does not 
exist" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.894113 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-f44rr"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.909819 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-wbxnd" podStartSLOduration=9.976301507 podStartE2EDuration="23.909784563s" podCreationTimestamp="2026-03-20 13:48:19 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.903353564 +0000 UTC m=+1085.501286093" lastFinishedPulling="2026-03-20 13:48:39.83683661 +0000 UTC m=+1099.434769149" observedRunningTime="2026-03-20 13:48:42.881483761 +0000 UTC m=+1102.479416300" watchObservedRunningTime="2026-03-20 13:48:42.909784563 +0000 UTC m=+1102.507717082" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.939541 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.941943 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.943893 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975529 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975663 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975726 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975753 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975781 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975854 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.975875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovn-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976444 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3a0e99e7-7429-41a7-bff7-23cafba6b78a-ovs-rundir\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976442 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.976882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3a0e99e7-7429-41a7-bff7-23cafba6b78a-config\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.991110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:42 crc kubenswrapper[4755]: I0320 13:48:42.996540 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a0e99e7-7429-41a7-bff7-23cafba6b78a-combined-ca-bundle\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.009623 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjvsr\" (UniqueName: \"kubernetes.io/projected/3a0e99e7-7429-41a7-bff7-23cafba6b78a-kube-api-access-vjvsr\") pod \"ovn-controller-metrics-nmwms\" (UID: \"3a0e99e7-7429-41a7-bff7-23cafba6b78a\") " pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077337 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077418 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077449 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.077470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.081545 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.082945 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.084025 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.108029 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"dnsmasq-dns-6bc7876d45-hd59d\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.128112 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nmwms" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.250011 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d52487-e77f-403a-a60e-af716068e035" path="/var/lib/kubelet/pods/b8d52487-e77f-403a-a60e-af716068e035/volumes" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.250983 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.251361 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.259215 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.263071 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.276116 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.280290 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383724 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383807 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383830 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.383962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486221 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486354 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486387 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486410 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.486437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.487578 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.487909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod 
\"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.489608 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.490859 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.507067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"dnsmasq-dns-8554648995-jb5zr\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:43 crc kubenswrapper[4755]: I0320 13:48:43.603812 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.816179 4755 generic.go:334] "Generic (PLEG): container finished" podID="23ab8e52-0cde-43ec-af8d-24f794695200" containerID="8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e" exitCode=0 Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.816361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerDied","Data":"8e7da803b4de22200ab08a28b849d2b793f22c38ee8203af20f34c671d088b6e"} Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.823722 4755 generic.go:334] "Generic (PLEG): container finished" podID="d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe" containerID="e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb" exitCode=0 Mar 20 13:48:44 crc kubenswrapper[4755]: I0320 13:48:44.823774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerDied","Data":"e9c3540d5249a821a4879cc4bdc6c9cc93aa5919ff7810f73673afeb85a6a2cb"} Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.805171 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a0e99e7_7429_41a7_bff7_23cafba6b78a.slice/crio-f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b WatchSource:0}: Error finding container f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b: Status 404 returned error can't find the container with id f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.805396 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nmwms"] Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.808489 4755 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bec7874_a7ec_4bf9_a716_0bf6bb9563fa.slice/crio-011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb WatchSource:0}: Error finding container 011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb: Status 404 returned error can't find the container with id 011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.812169 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.839826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmwms" event={"ID":"3a0e99e7-7429-41a7-bff7-23cafba6b78a","Type":"ContainerStarted","Data":"f09ddbf8ba6772480556dfd20e36ea181a96ddec243261f55f95673b6057089b"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.841135 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerStarted","Data":"011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.843782 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"fed1ecda-4acb-4a4c-a84e-12e58b3ad243","Type":"ContainerStarted","Data":"fb83634faa1a093a5211224c7e5c8a5272ed30f71d037303f0881e8562abb624"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.846009 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"23ab8e52-0cde-43ec-af8d-24f794695200","Type":"ContainerStarted","Data":"d2dc1b19311c9fbc046bc5201d86d95ee2a2ecb1e4d0343efe49d55d2eb775c9"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.847937 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"de877bb8-b1cd-45de-94c1-5242659fd03e","Type":"ContainerStarted","Data":"b69fac00382c6565cb7d427ec68d14180948f8f53f335adc13978223db89cf79"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.850333 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe","Type":"ContainerStarted","Data":"cc1d3bbf69eaa17d4630b689bbb4a5fbe1d0b3d5011e9e32117acb1017cc80b2"} Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.868980 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.257119624 podStartE2EDuration="27.868960566s" podCreationTimestamp="2026-03-20 13:48:18 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.741076132 +0000 UTC m=+1085.339008661" lastFinishedPulling="2026-03-20 13:48:45.352917074 +0000 UTC m=+1104.950849603" observedRunningTime="2026-03-20 13:48:45.86111255 +0000 UTC m=+1105.459045079" watchObservedRunningTime="2026-03-20 13:48:45.868960566 +0000 UTC m=+1105.466893095" Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.889287 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=8.863543174 podStartE2EDuration="23.889267508s" podCreationTimestamp="2026-03-20 13:48:22 +0000 UTC" firstStartedPulling="2026-03-20 13:48:30.294182046 +0000 UTC m=+1089.892114575" lastFinishedPulling="2026-03-20 13:48:45.31990635 +0000 UTC m=+1104.917838909" observedRunningTime="2026-03-20 13:48:45.886349741 +0000 UTC m=+1105.484282270" watchObservedRunningTime="2026-03-20 13:48:45.889267508 +0000 UTC m=+1105.487200037" Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.925098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.928189 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.96025484 podStartE2EDuration="33.928162037s" podCreationTimestamp="2026-03-20 13:48:12 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.563489468 +0000 UTC m=+1085.161421997" lastFinishedPulling="2026-03-20 13:48:37.531396665 +0000 UTC m=+1097.129329194" observedRunningTime="2026-03-20 13:48:45.922359455 +0000 UTC m=+1105.520291974" watchObservedRunningTime="2026-03-20 13:48:45.928162037 +0000 UTC m=+1105.526094566" Mar 20 13:48:45 crc kubenswrapper[4755]: W0320 13:48:45.940194 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb WatchSource:0}: Error finding container a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb: Status 404 returned error can't find the container with id a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb Mar 20 13:48:45 crc kubenswrapper[4755]: I0320 13:48:45.957270 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=20.92445026 podStartE2EDuration="34.957242869s" podCreationTimestamp="2026-03-20 13:48:11 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.369482075 +0000 UTC m=+1084.967414614" lastFinishedPulling="2026-03-20 13:48:39.402274704 +0000 UTC m=+1099.000207223" observedRunningTime="2026-03-20 13:48:45.94507519 +0000 UTC m=+1105.543007739" watchObservedRunningTime="2026-03-20 13:48:45.957242869 +0000 UTC m=+1105.555175398" Mar 20 13:48:46 crc kubenswrapper[4755]: E0320 13:48:46.282214 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8e3517b_eb0a_41ae_8dde_78040dd4088e.slice/crio-conmon-11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.797682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.864278 4755 generic.go:334] "Generic (PLEG): container finished" podID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" exitCode=0 Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.864407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871018 4755 generic.go:334] "Generic (PLEG): container finished" podID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerID="11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97" exitCode=0 Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerDied","Data":"11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.871146 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" 
event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerStarted","Data":"a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.886096 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nmwms" event={"ID":"3a0e99e7-7429-41a7-bff7-23cafba6b78a","Type":"ContainerStarted","Data":"f6294141f948778e685f9c991c03d03c378e3300d2b33d4ffcdb6e4693d6c079"} Mar 20 13:48:46 crc kubenswrapper[4755]: I0320 13:48:46.960578 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nmwms" podStartSLOduration=4.960553426 podStartE2EDuration="4.960553426s" podCreationTimestamp="2026-03-20 13:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:46.950601565 +0000 UTC m=+1106.548534094" watchObservedRunningTime="2026-03-20 13:48:46.960553426 +0000 UTC m=+1106.558485955" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.129189 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.172289 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.289454 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.366261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.366385 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.367012 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.367078 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") pod \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\" (UID: \"d8e3517b-eb0a-41ae-8dde-78040dd4088e\") " Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.375130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4" (OuterVolumeSpecName: "kube-api-access-x5md4") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "kube-api-access-x5md4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.385805 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config" (OuterVolumeSpecName: "config") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.401159 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.403421 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.409397 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8e3517b-eb0a-41ae-8dde-78040dd4088e" (UID: "d8e3517b-eb0a-41ae-8dde-78040dd4088e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.459558 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471081 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471114 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5md4\" (UniqueName: \"kubernetes.io/projected/d8e3517b-eb0a-41ae-8dde-78040dd4088e-kube-api-access-x5md4\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471125 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.471134 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8e3517b-eb0a-41ae-8dde-78040dd4088e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.903322 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerStarted","Data":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.903712 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.906616 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" 
event={"ID":"d8e3517b-eb0a-41ae-8dde-78040dd4088e","Type":"ContainerDied","Data":"a052631a86fd2d2832d1120e0754d099fde2bd4911b11faa38f140f98ad053eb"} Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.906803 4755 scope.go:117] "RemoveContainer" containerID="11953e1818a1152dd6feb8790dfb45c65cb6451e6585728a123c4f0436b8ad97" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.907063 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc7876d45-hd59d" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.908190 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.908223 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: E0320 13:48:47.911806 4755 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.181:59796->38.102.83.181:38787: read tcp 38.102.83.181:59796->38.102.83.181:38787: read: connection reset by peer Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.954070 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-jb5zr" podStartSLOduration=4.954045956 podStartE2EDuration="4.954045956s" podCreationTimestamp="2026-03-20 13:48:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:47.939766172 +0000 UTC m=+1107.537698711" watchObservedRunningTime="2026-03-20 13:48:47.954045956 +0000 UTC m=+1107.551978675" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.966394 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:48:47 crc kubenswrapper[4755]: I0320 13:48:47.976806 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovsdbserver-sb-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.074182 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.081560 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc7876d45-hd59d"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.324581 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:48 crc kubenswrapper[4755]: E0320 13:48:48.325029 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.325052 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.325266 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" containerName="init" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.326212 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.329893 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qqrf6" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.330088 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.330129 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.335765 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.346440 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.391946 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.391987 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392034 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" 
Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392055 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392180 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.392339 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493686 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493764 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.493885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494010 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.494140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: 
\"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-scripts\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.495991 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1bdd912-fe33-4449-aed8-12a5ee09961e-config\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.499440 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.509774 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.511115 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d1bdd912-fe33-4449-aed8-12a5ee09961e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.512968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4d72m\" (UniqueName: \"kubernetes.io/projected/d1bdd912-fe33-4449-aed8-12a5ee09961e-kube-api-access-4d72m\") pod \"ovn-northd-0\" (UID: \"d1bdd912-fe33-4449-aed8-12a5ee09961e\") " pod="openstack/ovn-northd-0" Mar 20 13:48:48 crc kubenswrapper[4755]: I0320 13:48:48.647353 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.133822 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:48:49 crc kubenswrapper[4755]: W0320 13:48:49.135627 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1bdd912_fe33_4449_aed8_12a5ee09961e.slice/crio-8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63 WatchSource:0}: Error finding container 8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63: Status 404 returned error can't find the container with id 8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63 Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.236193 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8e3517b-eb0a-41ae-8dde-78040dd4088e" path="/var/lib/kubelet/pods/d8e3517b-eb0a-41ae-8dde-78040dd4088e/volumes" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.521944 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 13:48:49 crc kubenswrapper[4755]: I0320 13:48:49.935842 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"8c3e766d9d2a4045776c7ad15b7e7499df157c7414a3eac20a890db77693bb63"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.954687 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"57f8bc7ef5c60aaea9ff6f9d76059002e2b138b6ca44c5c979810e197fb70ce1"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.955118 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"d1bdd912-fe33-4449-aed8-12a5ee09961e","Type":"ContainerStarted","Data":"3a074e42c4e9a527a8f95be94d47cd42be948d4db4396fea9ffc29303c88cb72"} Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.957012 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 13:48:50 crc kubenswrapper[4755]: I0320 13:48:50.995590 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.91711162 podStartE2EDuration="2.995571637s" podCreationTimestamp="2026-03-20 13:48:48 +0000 UTC" firstStartedPulling="2026-03-20 13:48:49.138474979 +0000 UTC m=+1108.736407518" lastFinishedPulling="2026-03-20 13:48:50.216934986 +0000 UTC m=+1109.814867535" observedRunningTime="2026-03-20 13:48:50.984488406 +0000 UTC m=+1110.582420935" watchObservedRunningTime="2026-03-20 13:48:50.995571637 +0000 UTC m=+1110.593504156" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.820923 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.821481 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:48:52 crc kubenswrapper[4755]: I0320 13:48:52.945871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/openstack-galera-0" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.084747 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.606022 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.695352 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:53 crc kubenswrapper[4755]: I0320 13:48:53.697422 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" containerID="cri-o://93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" gracePeriod=10 Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.457379 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.457499 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.551022 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.987386 4755 generic.go:334] "Generic (PLEG): container finished" podID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerID="93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" exitCode=0 Mar 20 13:48:54 crc kubenswrapper[4755]: I0320 13:48:54.987418 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" 
event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74"} Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.090219 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.307821 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433404 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433473 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.433550 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") pod \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\" (UID: \"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664\") " Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.462239 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4" (OuterVolumeSpecName: "kube-api-access-cmdb4") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "kube-api-access-cmdb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:55 crc kubenswrapper[4755]: E0320 13:48:55.470807 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470827 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: E0320 13:48:55.470923 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="init" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.470934 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="init" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.471134 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" containerName="dnsmasq-dns" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.471593 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.474399 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.481609 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.490436 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config" (OuterVolumeSpecName: "config") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.500616 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" (UID: "3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.524283 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.525446 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.535887 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.535974 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536037 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536051 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536063 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmdb4\" (UniqueName: \"kubernetes.io/projected/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664-kube-api-access-cmdb4\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.536248 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637746 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637902 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.637926 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.638771 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 
crc kubenswrapper[4755]: I0320 13:48:55.664515 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"keystone-8e2f-account-create-update-cvvh2\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.696372 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.697929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.709427 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.739616 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.739718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.740684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"keystone-db-create-g2hvs\" (UID: 
\"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.758301 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"keystone-db-create-g2hvs\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.795285 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.796577 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.798804 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.805962 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.841134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.841260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " 
pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.873339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.878219 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943577 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.943834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: 
\"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.944706 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:55 crc kubenswrapper[4755]: I0320 13:48:55.967296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"placement-db-create-ng8vm\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.011520 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.012080 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8qk87" event={"ID":"3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664","Type":"ContainerDied","Data":"b08752445d4cb4a4b8b6c2c978645cf8d6b89df6eef356585e9cb68a217e3d17"} Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.012126 4755 scope.go:117] "RemoveContainer" containerID="93b1018666de90b210c22140b3fd383bf8526435644dbd66ed7aa3328bd21d74" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.015574 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.045259 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.045348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.046033 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.047227 4755 scope.go:117] "RemoveContainer" containerID="3fd02a266f1ee85022a2ecd91d222ecf18e5e831fe110e3e465855de38b0e3d4" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.076121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"placement-9157-account-create-update-q8r48\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.081484 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.100534 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8qk87"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.111625 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.353353 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79c00857_0d6a_4c12_8581_da16e2a24f04.slice/crio-fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034 WatchSource:0}: Error finding container fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034: Status 404 returned error can't find the container with id fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.356915 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.404286 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.405785 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0587eb58_cd5e_4e0b_be30_97e0a569fc57.slice/crio-cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69 WatchSource:0}: Error finding container cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69: Status 404 returned error can't find the container with id cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.428974 4755 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.433743 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0795b626_b382_4b9b_beb5_802cebc4f764.slice/crio-92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af WatchSource:0}: Error finding container 92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af: Status 404 returned error can't find the container with id 92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.510289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:48:56 crc kubenswrapper[4755]: W0320 13:48:56.515443 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2af42784_d5cc_4f7c_832a_f91dbd54cc3f.slice/crio-ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4 WatchSource:0}: Error finding container ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4: Status 404 returned error can't find the container with id ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4 Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.872210 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.873607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.940026 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.965955 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966056 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:56 crc kubenswrapper[4755]: I0320 13:48:56.966146 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.027974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerStarted","Data":"80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.028018 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerStarted","Data":"ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.037871 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerStarted","Data":"9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.037919 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerStarted","Data":"fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.041356 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerStarted","Data":"0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.041398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerStarted","Data":"cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.043711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerStarted","Data":"674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.043743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerStarted","Data":"92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af"} Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.051228 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-ng8vm" podStartSLOduration=2.051210608 podStartE2EDuration="2.051210608s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.047388588 +0000 UTC m=+1116.645321117" watchObservedRunningTime="2026-03-20 13:48:57.051210608 +0000 UTC m=+1116.649143137" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.066636 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-g2hvs" podStartSLOduration=2.066615642 podStartE2EDuration="2.066615642s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.062134334 +0000 UTC m=+1116.660066863" watchObservedRunningTime="2026-03-20 13:48:57.066615642 +0000 UTC m=+1116.664548171" Mar 20 13:48:57 crc 
kubenswrapper[4755]: I0320 13:48:57.068041 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068098 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068176 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.068205 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.069135 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.069640 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.070518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.071193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.087441 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-9157-account-create-update-q8r48" podStartSLOduration=2.087417926 podStartE2EDuration="2.087417926s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.083202637 +0000 UTC m=+1116.681135176" watchObservedRunningTime="2026-03-20 13:48:57.087417926 +0000 UTC m=+1116.685350465" Mar 20 13:48:57 crc 
kubenswrapper[4755]: I0320 13:48:57.091415 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"dnsmasq-dns-b8fbc5445-6mzzb\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") " pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.104867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8e2f-account-create-update-cvvh2" podStartSLOduration=2.104847533 podStartE2EDuration="2.104847533s" podCreationTimestamp="2026-03-20 13:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:57.098950029 +0000 UTC m=+1116.696882558" watchObservedRunningTime="2026-03-20 13:48:57.104847533 +0000 UTC m=+1116.702780072" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.191257 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.240132 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664" path="/var/lib/kubelet/pods/3b17dc4b-4fa8-4e25-8ed7-9c4c1b1fd664/volumes" Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.624289 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:48:57 crc kubenswrapper[4755]: I0320 13:48:57.998035 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.034232 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038306 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038350 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.038491 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-lxknx" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.040915 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.046078 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.071958 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerID="1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.072033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.072947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerStarted","Data":"267e7baeb6290269d8531900c4aac9bc633ebbbaa20000911465b29c50a00f91"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.075599 4755 generic.go:334] "Generic (PLEG): container finished" podID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerID="0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4" 
exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.075678 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerDied","Data":"0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.077413 4755 generic.go:334] "Generic (PLEG): container finished" podID="0795b626-b382-4b9b-beb5-802cebc4f764" containerID="674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.077463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerDied","Data":"674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085376 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085461 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085522 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " 
pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085589 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.085733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.086443 4755 generic.go:334] "Generic (PLEG): container finished" podID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerID="80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb" exitCode=0 Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.086610 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerDied","Data":"80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.090298 4755 generic.go:334] "Generic (PLEG): container finished" podID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerID="9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955" exitCode=0 Mar 20 13:48:58 crc 
kubenswrapper[4755]: I0320 13:48:58.090330 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerDied","Data":"9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955"} Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.187860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: 
\"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.188331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188860 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188899 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.188947 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:48:58.688927607 +0000 UTC m=+1118.286860146 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.189451 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.189677 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-cache\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.190283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/70300053-7713-4d2c-8e59-a123e9f0f189-lock\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.192570 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/70300053-7713-4d2c-8e59-a123e9f0f189-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.209724 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkms8\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-kube-api-access-bkms8\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " 
pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.213376 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.497969 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.499901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.501939 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.502279 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.502872 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.511933 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594731 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594788 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594863 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.594895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595213 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.595328 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697286 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697440 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697468 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697510 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697558 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697679 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697727 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.697909 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698366 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698401 4755 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: E0320 13:48:58.698466 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:48:59.698443816 +0000 UTC m=+1119.296376365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.698733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.698769 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.703809 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.709627 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.717039 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.720573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"swift-ring-rebalance-j55xs\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:58 crc kubenswrapper[4755]: I0320 13:48:58.832893 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.097622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerStarted","Data":"0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141"} Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.100108 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.130876 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podStartSLOduration=3.130847056 podStartE2EDuration="3.130847056s" podCreationTimestamp="2026-03-20 13:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:59.125191068 +0000 UTC m=+1118.723123607" watchObservedRunningTime="2026-03-20 13:48:59.130847056 +0000 UTC m=+1118.728779585" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.304008 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-j55xs"] Mar 20 13:48:59 crc kubenswrapper[4755]: W0320 13:48:59.323262 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc13f5042_e5e5_47a3_bc96_b504a0bf9af2.slice/crio-fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5 WatchSource:0}: Error finding container fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5: Status 404 returned error can't find the container with id fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5 Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.500136 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.620678 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") pod \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.620832 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") pod \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\" (UID: \"2af42784-d5cc-4f7c-832a-f91dbd54cc3f\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.622296 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2af42784-d5cc-4f7c-832a-f91dbd54cc3f" (UID: "2af42784-d5cc-4f7c-832a-f91dbd54cc3f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.628049 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb" (OuterVolumeSpecName: "kube-api-access-p5vhb") pod "2af42784-d5cc-4f7c-832a-f91dbd54cc3f" (UID: "2af42784-d5cc-4f7c-832a-f91dbd54cc3f"). InnerVolumeSpecName "kube-api-access-p5vhb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.706900 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.708553 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.708574 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.708764 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.709258 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.723069 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723484 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723593 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.723666 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 
nodeName:}" failed. No retries permitted until 2026-03-20 13:49:01.723633197 +0000 UTC m=+1121.321565726 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.724025 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.724043 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5vhb\" (UniqueName: \"kubernetes.io/projected/2af42784-d5cc-4f7c-832a-f91dbd54cc3f-kube-api-access-p5vhb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.728950 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.730075 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.740351 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.745714 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") pod \"79c00857-0d6a-4c12-8581-da16e2a24f04\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825637 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") pod \"0795b626-b382-4b9b-beb5-802cebc4f764\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825704 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") pod \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825727 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") pod \"79c00857-0d6a-4c12-8581-da16e2a24f04\" (UID: \"79c00857-0d6a-4c12-8581-da16e2a24f04\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.825754 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") pod \"0795b626-b382-4b9b-beb5-802cebc4f764\" (UID: \"0795b626-b382-4b9b-beb5-802cebc4f764\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826024 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") pod \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\" (UID: \"0587eb58-cd5e-4e0b-be30-97e0a569fc57\") " Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826340 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "79c00857-0d6a-4c12-8581-da16e2a24f04" (UID: "79c00857-0d6a-4c12-8581-da16e2a24f04"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826361 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0795b626-b382-4b9b-beb5-802cebc4f764" (UID: "0795b626-b382-4b9b-beb5-802cebc4f764"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826402 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0587eb58-cd5e-4e0b-be30-97e0a569fc57" (UID: "0587eb58-cd5e-4e0b-be30-97e0a569fc57"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826424 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826580 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/79c00857-0d6a-4c12-8581-da16e2a24f04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826592 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0795b626-b382-4b9b-beb5-802cebc4f764-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.826603 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0587eb58-cd5e-4e0b-be30-97e0a569fc57-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.829059 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz" (OuterVolumeSpecName: "kube-api-access-vnzpz") pod "0587eb58-cd5e-4e0b-be30-97e0a569fc57" 
(UID: "0587eb58-cd5e-4e0b-be30-97e0a569fc57"). InnerVolumeSpecName "kube-api-access-vnzpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.829645 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6" (OuterVolumeSpecName: "kube-api-access-g9vg6") pod "0795b626-b382-4b9b-beb5-802cebc4f764" (UID: "0795b626-b382-4b9b-beb5-802cebc4f764"). InnerVolumeSpecName "kube-api-access-g9vg6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.831866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9" (OuterVolumeSpecName: "kube-api-access-zmsc9") pod "79c00857-0d6a-4c12-8581-da16e2a24f04" (UID: "79c00857-0d6a-4c12-8581-da16e2a24f04"). InnerVolumeSpecName "kube-api-access-zmsc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865112 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865410 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865427 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865458 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865466 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: E0320 13:48:59.865481 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865487 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865633 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865664 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" containerName="mariadb-database-create" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.865673 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" containerName="mariadb-account-create-update" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.866133 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.874081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.905781 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.927884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.927969 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928026 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 
13:48:59.928075 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928162 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9vg6\" (UniqueName: \"kubernetes.io/projected/0795b626-b382-4b9b-beb5-802cebc4f764-kube-api-access-g9vg6\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928194 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnzpz\" (UniqueName: \"kubernetes.io/projected/0587eb58-cd5e-4e0b-be30-97e0a569fc57-kube-api-access-vnzpz\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928206 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmsc9\" (UniqueName: \"kubernetes.io/projected/79c00857-0d6a-4c12-8581-da16e2a24f04-kube-api-access-zmsc9\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.928865 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:48:59 crc kubenswrapper[4755]: I0320 13:48:59.943687 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"glance-db-create-pg2bq\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 
13:49:00.029571 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.029724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.030336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.045681 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"glance-35fe-account-create-update-h6fl8\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.065076 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8e2f-account-create-update-cvvh2" event={"ID":"79c00857-0d6a-4c12-8581-da16e2a24f04","Type":"ContainerDied","Data":"fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108404 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb984e35e9681ba2fa2616cd59d8c65343be17474774064f88745675df390034" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.108495 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-8e2f-account-create-update-cvvh2" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126246 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-9157-account-create-update-q8r48" event={"ID":"0587eb58-cd5e-4e0b-be30-97e0a569fc57","Type":"ContainerDied","Data":"cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126308 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd652b51d9f2d9dd355d1e5b0d59a7abcca1a7313a61036bc9f76808baff6a69" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.126406 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-9157-account-create-update-q8r48" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.132732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerStarted","Data":"fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138355 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-g2hvs" event={"ID":"0795b626-b382-4b9b-beb5-802cebc4f764","Type":"ContainerDied","Data":"92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138392 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92038c4d7f7896835d49eff73f141484795bb3c21fa9ff835393a2a5904d62af" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.138415 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-g2hvs" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143826 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-ng8vm" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-ng8vm" event={"ID":"2af42784-d5cc-4f7c-832a-f91dbd54cc3f","Type":"ContainerDied","Data":"ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4"} Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.143875 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef4efa5fd567d9bf67c22963707517681fbda783676ce7987c254346770249f4" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.198406 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.410352 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:49:00 crc kubenswrapper[4755]: I0320 13:49:00.698393 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:49:00 crc kubenswrapper[4755]: W0320 13:49:00.705351 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46d041c2_e231_49fd_9d88_a991a1b9dd65.slice/crio-730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103 WatchSource:0}: Error finding container 730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103: Status 404 returned error can't find the container with id 730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103 Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.156139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerStarted","Data":"730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103"} Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.157955 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerStarted","Data":"de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1"} Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.447769 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.448967 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.452924 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.477012 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.561563 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.561726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.666033 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.666307 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: 
\"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.668513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.689990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"root-account-create-update-9jkxf\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.768409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768701 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768845 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: E0320 13:49:01.768929 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. 
No retries permitted until 2026-03-20 13:49:05.768890323 +0000 UTC m=+1125.366822862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:49:01 crc kubenswrapper[4755]: I0320 13:49:01.781126 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:02 crc kubenswrapper[4755]: I0320 13:49:02.261737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:02 crc kubenswrapper[4755]: W0320 13:49:02.264882 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd1b4f7f_9951_4976_b4e6_3222cc1ac6a2.slice/crio-35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0 WatchSource:0}: Error finding container 35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0: Status 404 returned error can't find the container with id 35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0 Mar 20 13:49:03 crc kubenswrapper[4755]: I0320 13:49:03.218556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerStarted","Data":"35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.230946 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerStarted","Data":"d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.234584 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerStarted","Data":"618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.236432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerStarted","Data":"ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3"} Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.264492 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-9jkxf" podStartSLOduration=3.264466559 podStartE2EDuration="3.264466559s" podCreationTimestamp="2026-03-20 13:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.257730103 +0000 UTC m=+1123.855662652" watchObservedRunningTime="2026-03-20 13:49:04.264466559 +0000 UTC m=+1123.862399118" Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.277470 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-35fe-account-create-update-h6fl8" podStartSLOduration=5.277444749 podStartE2EDuration="5.277444749s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.271235507 +0000 UTC m=+1123.869168066" watchObservedRunningTime="2026-03-20 13:49:04.277444749 +0000 UTC m=+1123.875377278" Mar 20 13:49:04 crc kubenswrapper[4755]: I0320 13:49:04.291842 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-pg2bq" podStartSLOduration=5.291821406 podStartE2EDuration="5.291821406s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:04.286848015 +0000 UTC m=+1123.884780544" watchObservedRunningTime="2026-03-20 13:49:04.291821406 +0000 UTC m=+1123.889753945" Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.245860 4755 generic.go:334] "Generic (PLEG): container finished" podID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerID="ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.245950 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerDied","Data":"ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3"} Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.248166 4755 generic.go:334] "Generic (PLEG): container finished" podID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerID="d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.248263 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerDied","Data":"d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7"} Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.250450 4755 generic.go:334] "Generic (PLEG): container finished" podID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerID="618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810" exitCode=0 Mar 20 13:49:05 crc kubenswrapper[4755]: I0320 13:49:05.250491 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerDied","Data":"618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810"} Mar 20 13:49:05 
crc kubenswrapper[4755]: I0320 13:49:05.858121 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.858314 4755 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.859128 4755 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:49:05 crc kubenswrapper[4755]: E0320 13:49:05.859335 4755 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift podName:70300053-7713-4d2c-8e59-a123e9f0f189 nodeName:}" failed. No retries permitted until 2026-03-20 13:49:13.859302234 +0000 UTC m=+1133.457234803 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift") pod "swift-storage-0" (UID: "70300053-7713-4d2c-8e59-a123e9f0f189") : configmap "swift-ring-files" not found Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.757734 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.758547 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.779244 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.781759 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") pod \"46d041c2-e231-49fd-9d88-a991a1b9dd65\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.781843 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") pod \"46d041c2-e231-49fd-9d88-a991a1b9dd65\" (UID: \"46d041c2-e231-49fd-9d88-a991a1b9dd65\") " Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.782598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "46d041c2-e231-49fd-9d88-a991a1b9dd65" (UID: "46d041c2-e231-49fd-9d88-a991a1b9dd65"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.789104 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh" (OuterVolumeSpecName: "kube-api-access-cpsbh") pod "46d041c2-e231-49fd-9d88-a991a1b9dd65" (UID: "46d041c2-e231-49fd-9d88-a991a1b9dd65"). InnerVolumeSpecName "kube-api-access-cpsbh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.884534 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/46d041c2-e231-49fd-9d88-a991a1b9dd65-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:06 crc kubenswrapper[4755]: I0320 13:49:06.884570 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpsbh\" (UniqueName: \"kubernetes.io/projected/46d041c2-e231-49fd-9d88-a991a1b9dd65-kube-api-access-cpsbh\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.119199 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.126071 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.188620 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") pod \"6fe77db3-29ef-42ae-840b-9736f07188ca\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.188895 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") pod \"6fe77db3-29ef-42ae-840b-9736f07188ca\" (UID: \"6fe77db3-29ef-42ae-840b-9736f07188ca\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189006 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") pod 
\"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189063 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") pod \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\" (UID: \"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189791 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6fe77db3-29ef-42ae-840b-9736f07188ca" (UID: "6fe77db3-29ef-42ae-840b-9736f07188ca"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.189808 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" (UID: "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.192938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.195155 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt" (OuterVolumeSpecName: "kube-api-access-tcgjt") pod "6fe77db3-29ef-42ae-840b-9736f07188ca" (UID: "6fe77db3-29ef-42ae-840b-9736f07188ca"). InnerVolumeSpecName "kube-api-access-tcgjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.195996 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6" (OuterVolumeSpecName: "kube-api-access-bdld6") pod "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" (UID: "dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2"). InnerVolumeSpecName "kube-api-access-bdld6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.276834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-pg2bq" event={"ID":"6fe77db3-29ef-42ae-840b-9736f07188ca","Type":"ContainerDied","Data":"de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277686 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de372f04231662367647362de19336ae03c2d0702c48364546867341f54f7cc1" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.277837 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-jb5zr" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" containerID="cri-o://ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" gracePeriod=10 Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.278166 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-pg2bq" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281617 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-35fe-account-create-update-h6fl8" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-35fe-account-create-update-h6fl8" event={"ID":"46d041c2-e231-49fd-9d88-a991a1b9dd65","Type":"ContainerDied","Data":"730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.281866 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="730662e922f393033879a1c98aaa969cb69f00e3c4ad2bbe2beaf8272472f103" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285692 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9jkxf" event={"ID":"dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2","Type":"ContainerDied","Data":"35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285730 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35d69895bfd58f4a8d2b0e137a84401df9c8ca2e876891c20d6da56d4e5c41e0" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.285796 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9jkxf" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.289048 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerStarted","Data":"dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc"} Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290681 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcgjt\" (UniqueName: \"kubernetes.io/projected/6fe77db3-29ef-42ae-840b-9736f07188ca-kube-api-access-tcgjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290710 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6fe77db3-29ef-42ae-840b-9736f07188ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290720 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.290729 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdld6\" (UniqueName: \"kubernetes.io/projected/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2-kube-api-access-bdld6\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.326849 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-j55xs" podStartSLOduration=1.717706781 podStartE2EDuration="9.326823975s" podCreationTimestamp="2026-03-20 13:48:58 +0000 UTC" firstStartedPulling="2026-03-20 13:48:59.326190413 +0000 UTC m=+1118.924122942" lastFinishedPulling="2026-03-20 13:49:06.935307607 +0000 UTC m=+1126.533240136" observedRunningTime="2026-03-20 
13:49:07.316191857 +0000 UTC m=+1126.914124406" watchObservedRunningTime="2026-03-20 13:49:07.326823975 +0000 UTC m=+1126.924756514" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.783590 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.798035 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.800061 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9jkxf"] Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.901835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.901996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902046 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" 
(UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.902178 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") pod \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\" (UID: \"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa\") " Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.910978 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr" (OuterVolumeSpecName: "kube-api-access-l4lhr") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "kube-api-access-l4lhr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.943575 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.947968 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.953013 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config" (OuterVolumeSpecName: "config") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:07 crc kubenswrapper[4755]: I0320 13:49:07.956985 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" (UID: "1bec7874-a7ec-4bf9-a716-0bf6bb9563fa"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004138 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004169 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4lhr\" (UniqueName: \"kubernetes.io/projected/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-kube-api-access-l4lhr\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004179 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.004190 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc 
kubenswrapper[4755]: I0320 13:49:08.004198 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298324 4755 generic.go:334] "Generic (PLEG): container finished" podID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" exitCode=0 Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298414 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-jb5zr" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298416 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298479 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-jb5zr" event={"ID":"1bec7874-a7ec-4bf9-a716-0bf6bb9563fa","Type":"ContainerDied","Data":"011734e48bcfa5f9a2e865cd91738a096622143b2d5728ea40ad7b4f45265deb"} Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.298509 4755 scope.go:117] "RemoveContainer" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.318968 4755 scope.go:117] "RemoveContainer" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.359571 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.366084 4755 scope.go:117] "RemoveContainer" 
containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.366714 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-jb5zr"] Mar 20 13:49:08 crc kubenswrapper[4755]: E0320 13:49:08.367008 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": container with ID starting with ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24 not found: ID does not exist" containerID="ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.367066 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24"} err="failed to get container status \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": rpc error: code = NotFound desc = could not find container \"ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24\": container with ID starting with ba0f16fc34f33d80c970a43e821c548e8d78741da401d3300f841bf8da655a24 not found: ID does not exist" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.367104 4755 scope.go:117] "RemoveContainer" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: E0320 13:49:08.367587 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": container with ID starting with d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367 not found: ID does not exist" containerID="d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 
13:49:08.367633 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367"} err="failed to get container status \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": rpc error: code = NotFound desc = could not find container \"d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367\": container with ID starting with d3f73d04f251f8929032abf4347a93b9186ae8284a8d112355ca60a818ec6367 not found: ID does not exist" Mar 20 13:49:08 crc kubenswrapper[4755]: I0320 13:49:08.722296 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.237757 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" path="/var/lib/kubelet/pods/1bec7874-a7ec-4bf9-a716-0bf6bb9563fa/volumes" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.239104 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" path="/var/lib/kubelet/pods/dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2/volumes" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.970947 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971251 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971263 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971278 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc 
kubenswrapper[4755]: I0320 13:49:09.971285 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971301 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971309 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971318 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971324 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: E0320 13:49:09.971345 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="init" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971352 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="init" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971496 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bec7874-a7ec-4bf9-a716-0bf6bb9563fa" containerName="dnsmasq-dns" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971507 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd1b4f7f-9951-4976-b4e6-3222cc1ac6a2" 
containerName="mariadb-account-create-update" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.971526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" containerName="mariadb-database-create" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.972047 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.975973 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.985807 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:49:09 crc kubenswrapper[4755]: I0320 13:49:09.989214 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.143962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod 
\"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.144011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245040 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245138 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.245257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " 
pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.250170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.250338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.263290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.268836 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"glance-db-sync-w78rr\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.314468 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:10 crc kubenswrapper[4755]: I0320 13:49:10.861642 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.332143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerStarted","Data":"035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499"} Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.472478 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.473958 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.478606 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.483572 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.581248 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.581323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: 
\"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.682757 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.682834 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.683702 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.710813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"root-account-create-update-qhmnv\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:11 crc kubenswrapper[4755]: I0320 13:49:11.805686 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:12 crc kubenswrapper[4755]: I0320 13:49:12.960811 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.352342 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerStarted","Data":"cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.352385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerStarted","Data":"1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.355558 4755 generic.go:334] "Generic (PLEG): container finished" podID="c2ca344f-8f18-4dd9-9e5c-44669ff2da4f" containerID="8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d" exitCode=0 Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.355622 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerDied","Data":"8e1766906106b58ad71f899855dae6854781edd4e85af35469d4a6541e6db08d"} Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.920038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.927294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/70300053-7713-4d2c-8e59-a123e9f0f189-etc-swift\") pod \"swift-storage-0\" (UID: \"70300053-7713-4d2c-8e59-a123e9f0f189\") " pod="openstack/swift-storage-0" Mar 20 13:49:13 crc kubenswrapper[4755]: I0320 13:49:13.970622 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.368633 4755 generic.go:334] "Generic (PLEG): container finished" podID="6d21386c-8267-4dba-9028-d5cb729ff78b" containerID="d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.368735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerDied","Data":"d6f7605f4c42bfaff2a7ad01f9513a1a2895247ba09bed2e9e8f4f0b129f847f"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.371114 4755 generic.go:334] "Generic (PLEG): container finished" podID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerID="dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.371164 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerDied","Data":"dde41b90a0a015b85da366c64b34cece8929f7524e251d4490d3d25207b86cbc"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.373815 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"c2ca344f-8f18-4dd9-9e5c-44669ff2da4f","Type":"ContainerStarted","Data":"97629bf77590f093931e8bac34f2e3412a85b42a926ed512dbd17949a8e3cb6d"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.374311 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 
13:49:14.383883 4755 generic.go:334] "Generic (PLEG): container finished" podID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerID="cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd" exitCode=0 Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.383932 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerDied","Data":"cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd"} Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.467733 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=50.945003195 podStartE2EDuration="1m5.467714821s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.5612671 +0000 UTC m=+1085.159199629" lastFinishedPulling="2026-03-20 13:48:40.083978726 +0000 UTC m=+1099.681911255" observedRunningTime="2026-03-20 13:49:14.46424564 +0000 UTC m=+1134.062178169" watchObservedRunningTime="2026-03-20 13:49:14.467714821 +0000 UTC m=+1134.065647350" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.566570 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.736882 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.827634 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-kbcdp" podUID="408d869f-0966-4908-88e5-37cdff345c4a" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:49:14 crc kubenswrapper[4755]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:49:14 crc kubenswrapper[4755]: > Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.848726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") pod \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.849416 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") pod \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\" (UID: \"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba\") " Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.850254 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" (UID: "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.851182 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.855824 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2" (OuterVolumeSpecName: "kube-api-access-qszp2") pod "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" (UID: "efce7341-ca4a-4b7a-9cfd-7a01ebed00ba"). InnerVolumeSpecName "kube-api-access-qszp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.857076 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-wbxnd" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.951583 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:14 crc kubenswrapper[4755]: I0320 13:49:14.951620 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qszp2\" (UniqueName: \"kubernetes.io/projected/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba-kube-api-access-qszp2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097000 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:15 crc kubenswrapper[4755]: E0320 13:49:15.097569 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097585 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" 
containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.097842 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" containerName="mariadb-account-create-update" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.098564 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.102029 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.118745 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155393 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155491 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155562 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " 
pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155592 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155621 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.155672 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.257885 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.259121 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" 
(UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.259792 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261456 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261868 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.261973 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: 
\"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.262080 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.263034 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.263208 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.265504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.293327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"ovn-controller-kbcdp-config-vp4xt\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " 
pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.393862 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"803b623182fc886142f88d5170663c1a27140b44d9bda1e4a361ee0d2fd977f2"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.399870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6d21386c-8267-4dba-9028-d5cb729ff78b","Type":"ContainerStarted","Data":"9af8c44c5331144d77cbc2a2a9cdec67745083a2a8f79716ac8bf8ff5681077a"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.402375 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405733 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-qhmnv" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405735 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-qhmnv" event={"ID":"efce7341-ca4a-4b7a-9cfd-7a01ebed00ba","Type":"ContainerDied","Data":"1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff"} Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.405869 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d91ca4169d746b344dbc1870b88592b3cfa51c6e71d814610bb8c37bcecadff" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.417800 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:15 crc kubenswrapper[4755]: I0320 13:49:15.428497 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=51.402908463 podStartE2EDuration="1m6.428475913s" podCreationTimestamp="2026-03-20 13:48:09 +0000 UTC" firstStartedPulling="2026-03-20 13:48:25.378905262 +0000 UTC m=+1084.976837801" lastFinishedPulling="2026-03-20 13:48:40.404472722 +0000 UTC m=+1100.002405251" observedRunningTime="2026-03-20 13:49:15.423433122 +0000 UTC m=+1135.021365661" watchObservedRunningTime="2026-03-20 13:49:15.428475913 +0000 UTC m=+1135.026408442" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.004939 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088042 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.088585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 
13:49:16.089710 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090172 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090221 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.090294 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") pod \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\" (UID: \"c13f5042-e5e5-47a3-bc96-b504a0bf9af2\") " Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.091721 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.094488 4755 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.095336 4755 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.095216 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57" (OuterVolumeSpecName: "kube-api-access-fzc57") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "kube-api-access-fzc57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.097341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.113477 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.139904 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts" (OuterVolumeSpecName: "scripts") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.146422 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c13f5042-e5e5-47a3-bc96-b504a0bf9af2" (UID: "c13f5042-e5e5-47a3-bc96-b504a0bf9af2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197463 4755 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197506 4755 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197517 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197534 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzc57\" (UniqueName: \"kubernetes.io/projected/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-kube-api-access-fzc57\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.197548 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c13f5042-e5e5-47a3-bc96-b504a0bf9af2-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.420246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.426582 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"25d5a4ff18b5184bc8816465e98810d024fc0341176edca6252e36aef172a9b3"} Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.433902 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-j55xs" Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.434331 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-j55xs" event={"ID":"c13f5042-e5e5-47a3-bc96-b504a0bf9af2","Type":"ContainerDied","Data":"fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5"} Mar 20 13:49:16 crc kubenswrapper[4755]: I0320 13:49:16.434420 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbbde7b8b459780a915ea6b9dfbe411f009ddb8c0323e1589dc63f9e9d42e0a5" Mar 20 13:49:16 crc kubenswrapper[4755]: W0320 13:49:16.446331 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb32031b_f717_4c38_817c_f9c84a6a50e5.slice/crio-701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8 WatchSource:0}: Error finding container 701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8: Status 404 returned error can't find the container with id 701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8 Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444152 4755 generic.go:334] "Generic (PLEG): container finished" podID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerID="bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9" exitCode=0 Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444216 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerDied","Data":"bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.444506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" 
event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerStarted","Data":"701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.450046 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"386f0d71882d58d04019ab933e4f5489c1d4439122b5590cef0983dded660199"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.450099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"aa7564708ab365be4808ac1e14360380b4b97d7a2bd7bce6b9eccb5e5ef9588e"} Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.772969 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:17 crc kubenswrapper[4755]: I0320 13:49:17.780085 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-qhmnv"] Mar 20 13:49:19 crc kubenswrapper[4755]: I0320 13:49:19.238460 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efce7341-ca4a-4b7a-9cfd-7a01ebed00ba" path="/var/lib/kubelet/pods/efce7341-ca4a-4b7a-9cfd-7a01ebed00ba/volumes" Mar 20 13:49:19 crc kubenswrapper[4755]: I0320 13:49:19.832632 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-kbcdp" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790247 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:22 crc kubenswrapper[4755]: E0320 13:49:22.790618 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790637 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.790885 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c13f5042-e5e5-47a3-bc96-b504a0bf9af2" containerName="swift-ring-rebalance" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.791814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.795048 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.809120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.826736 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.827290 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.929118 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: 
\"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.929191 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.930429 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:22 crc kubenswrapper[4755]: I0320 13:49:22.953203 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"root-account-create-update-jvtvk\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:23 crc kubenswrapper[4755]: I0320 13:49:23.123802 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.325972 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364175 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364291 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364316 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364362 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364351 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364384 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364449 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run" (OuterVolumeSpecName: "var-run") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364410 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.364574 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") pod \"eb32031b-f717-4c38-817c-f9c84a6a50e5\" (UID: \"eb32031b-f717-4c38-817c-f9c84a6a50e5\") " Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365008 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365519 4755 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365560 4755 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365581 4755 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.365604 4755 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/eb32031b-f717-4c38-817c-f9c84a6a50e5-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.366046 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts" (OuterVolumeSpecName: "scripts") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.372169 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp" (OuterVolumeSpecName: "kube-api-access-qwgzp") pod "eb32031b-f717-4c38-817c-f9c84a6a50e5" (UID: "eb32031b-f717-4c38-817c-f9c84a6a50e5"). InnerVolumeSpecName "kube-api-access-qwgzp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.467603 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eb32031b-f717-4c38-817c-f9c84a6a50e5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.468083 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwgzp\" (UniqueName: \"kubernetes.io/projected/eb32031b-f717-4c38-817c-f9c84a6a50e5-kube-api-access-qwgzp\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.533981 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-kbcdp-config-vp4xt" event={"ID":"eb32031b-f717-4c38-817c-f9c84a6a50e5","Type":"ContainerDied","Data":"701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8"} Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.534463 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="701f8630edc9ca88fb73141c8c2eddaa733685445def76aae5ebdfb21b2c6bc8" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.534260 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-kbcdp-config-vp4xt" Mar 20 13:49:25 crc kubenswrapper[4755]: I0320 13:49:25.864363 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.459099 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.467332 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-kbcdp-config-vp4xt"] Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546148 4755 generic.go:334] "Generic (PLEG): container finished" podID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerID="b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0" exitCode=0 Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546217 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerDied","Data":"b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.546250 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerStarted","Data":"73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.552725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"bb6427f350f857361e734cc6bbf1571e294e4ed9d2f52827d03d397adb9aac6b"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.558853 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" 
event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerStarted","Data":"357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56"} Mar 20 13:49:26 crc kubenswrapper[4755]: I0320 13:49:26.590841 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-w78rr" podStartSLOduration=2.986029478 podStartE2EDuration="17.590816844s" podCreationTimestamp="2026-03-20 13:49:09 +0000 UTC" firstStartedPulling="2026-03-20 13:49:10.871080796 +0000 UTC m=+1130.469013345" lastFinishedPulling="2026-03-20 13:49:25.475868182 +0000 UTC m=+1145.073800711" observedRunningTime="2026-03-20 13:49:26.586016528 +0000 UTC m=+1146.183949057" watchObservedRunningTime="2026-03-20 13:49:26.590816844 +0000 UTC m=+1146.188749373" Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.259241 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" path="/var/lib/kubelet/pods/eb32031b-f717-4c38-817c-f9c84a6a50e5/volumes" Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.570846 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"db511d929300fed522174fdba449c70ea7596737d488551f4e2f059cc273bbe3"} Mar 20 13:49:27 crc kubenswrapper[4755]: I0320 13:49:27.900787 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.016463 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") pod \"8ae45e95-b96a-4157-a584-a6eb321d5091\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.016838 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") pod \"8ae45e95-b96a-4157-a584-a6eb321d5091\" (UID: \"8ae45e95-b96a-4157-a584-a6eb321d5091\") " Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.018027 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8ae45e95-b96a-4157-a584-a6eb321d5091" (UID: "8ae45e95-b96a-4157-a584-a6eb321d5091"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.022535 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv" (OuterVolumeSpecName: "kube-api-access-9ncbv") pod "8ae45e95-b96a-4157-a584-a6eb321d5091" (UID: "8ae45e95-b96a-4157-a584-a6eb321d5091"). InnerVolumeSpecName "kube-api-access-9ncbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.118608 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ncbv\" (UniqueName: \"kubernetes.io/projected/8ae45e95-b96a-4157-a584-a6eb321d5091-kube-api-access-9ncbv\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.118923 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8ae45e95-b96a-4157-a584-a6eb321d5091-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581396 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-jvtvk" event={"ID":"8ae45e95-b96a-4157-a584-a6eb321d5091","Type":"ContainerDied","Data":"73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581466 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73ad3ea704ac6fcf7a56f21287d468f7a1e9a438c5cc324450490f6a0cb493ce" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.581490 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-jvtvk" Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587583 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"74b93bc77d9e03ce70262c37ca57793e0e8511f606820a6bf55e586e8e0cd619"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"2a9d8e97b92781ea960da33f5a3c070691cc4640898651227da318e9557b8dd0"} Mar 20 13:49:28 crc kubenswrapper[4755]: I0320 13:49:28.587630 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"4ac3d8248ee41af2077062af84da816bddfd837378ae9df44f304fc24e37aa43"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.616269 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"af20658b5b931396ba82dcef7b91e5c229663e6cf674739e4f96f3c62e654d76"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617337 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"5c0a8a5d65d078c7c6da6d1783e1d0b5e3153a501b388c51d47c6e4cafe3dab3"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"fa752aed9c1f5e29f08081bb21ff47476575077b9084bdf5e56a291b588db25f"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.617377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"66d618d73f3d742120b395cdbc6b49c6a661d26e4cb32676067b85d7a8f031ff"} Mar 20 13:49:30 crc kubenswrapper[4755]: I0320 13:49:30.867871 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.290901 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647884 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"45cd9d5f5e6ca55047b5b4ecba8c9ba341089dc30300d658188fe0fa39e785c3"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"095e77259823fa84683c79c5bd2e9aef61a1906b715026c515fe0b6d5f47296a"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.647971 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"70300053-7713-4d2c-8e59-a123e9f0f189","Type":"ContainerStarted","Data":"d0ca7024be80702ea50db5457c27fdf54d60676f49b089e45f727e61586b553c"} Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.725349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.652563965 podStartE2EDuration="35.725324972s" podCreationTimestamp="2026-03-20 13:48:56 +0000 UTC" firstStartedPulling="2026-03-20 13:49:14.598680932 +0000 UTC m=+1134.196613461" lastFinishedPulling="2026-03-20 13:49:29.671441909 +0000 UTC m=+1149.269374468" observedRunningTime="2026-03-20 13:49:31.71990418 +0000 UTC m=+1151.317836739" 
watchObservedRunningTime="2026-03-20 13:49:31.725324972 +0000 UTC m=+1151.323257501"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.826399 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"]
Mar 20 13:49:31 crc kubenswrapper[4755]: E0320 13:49:31.827058 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827082 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config"
Mar 20 13:49:31 crc kubenswrapper[4755]: E0320 13:49:31.827113 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827120 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827383 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" containerName="mariadb-account-create-update"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.827414 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb32031b-f717-4c38-817c-f9c84a6a50e5" containerName="ovn-config"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.828224 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.836174 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.842109 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2jwbt"]
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.843801 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.868564 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"]
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.880749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2jwbt"]
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.892007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.892099 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.984698 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-jm9nr"]
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.988055 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994474 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.994586 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.995631 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:31 crc kubenswrapper[4755]: I0320 13:49:31.996406 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jm9nr"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.019054 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-hm9qz"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.020606 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.053970 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hm9qz"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.064803 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"cinder-9528-account-create-update-6xkmx\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.096937 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097012 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097038 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097085 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.097824 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.118341 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.121409 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.123368 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.140901 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"cinder-db-create-2jwbt\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.154824 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.160283 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.169102 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.169557 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.187723 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.188645 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200241 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200303 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200401 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200437 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200518 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200578 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.200615 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.201894 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.202391 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.226122 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.241446 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"neutron-db-create-hm9qz\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.253635 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"barbican-db-create-jm9nr\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.300748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-9xrbx"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.301959 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302023 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302081 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302177 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302202 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.302263 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.303419 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.303489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.304198 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.305781 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.305929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.306605 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.310163 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.310770 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.311346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.312801 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.318005 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xrbx"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.319887 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.330599 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.332110 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"dnsmasq-dns-5c79d794d7-x5fhv\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.343901 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.346303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.348315 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.363850 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.404066 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405351 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405445 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405612 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405727 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.405826 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.407394 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.465847 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"barbican-1376-account-create-update-jhbhp\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.488106 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506889 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506933 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.506960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.507004 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.507064 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.509612 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.514434 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.518248 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.518591 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.531947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"keystone-db-sync-9xrbx\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.535187 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"neutron-fc04-account-create-update-x9t57\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.616258 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2jwbt"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.661824 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.677905 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57"
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.706897 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerStarted","Data":"ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9"}
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.755423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.971971 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-jm9nr"]
Mar 20 13:49:32 crc kubenswrapper[4755]: I0320 13:49:32.999249 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-hm9qz"]
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.009965 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"]
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.103521 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"]
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.454310 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"]
Mar 20 13:49:33 crc kubenswrapper[4755]: W0320 13:49:33.468867 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34c85756_25cf_4302_bd5d_72f2e459f562.slice/crio-b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00 WatchSource:0}: Error finding container b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00: Status 404 returned error can't find the container with id b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.472319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-9xrbx"]
Mar 20 13:49:33 crc kubenswrapper[4755]: W0320 13:49:33.479232 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64ad8e64_0606_4171_bd2d_ae8212fdff8f.slice/crio-ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21 WatchSource:0}: Error finding container ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21: Status 404 returned error can't find the container with id ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.716562 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerStarted","Data":"ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719165 4755 generic.go:334] "Generic (PLEG): container finished" podID="e38d31ac-eae6-4cd1-be04-304215db852a" containerID="c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8" exitCode=0
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719284 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerDied","Data":"c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.719326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerStarted","Data":"842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.726215 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerStarted","Data":"b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727493 4755 generic.go:334] "Generic (PLEG): container finished" podID="feb55e83-711d-4561-8b57-2a231944e1b1" containerID="705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12" exitCode=0
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerDied","Data":"705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.727560 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerStarted","Data":"bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729037 4755 generic.go:334] "Generic (PLEG): container finished" podID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerID="035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557" exitCode=0
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729089 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerDied","Data":"035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.729104 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerStarted","Data":"7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.730129 4755 generic.go:334] "Generic (PLEG): container finished" podID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerID="02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645" exitCode=0
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.730184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerDied","Data":"02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731388 4755 generic.go:334] "Generic (PLEG): container finished" podID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerID="60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2" exitCode=0
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731470 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp" event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerDied","Data":"60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2"}
Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.731499 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp"
event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerStarted","Data":"040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732561 4755 generic.go:334] "Generic (PLEG): container finished" podID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" exitCode=0 Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732604 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788"} Mar 20 13:49:33 crc kubenswrapper[4755]: I0320 13:49:33.732617 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerStarted","Data":"61e599f2b06be42605f3f5420cbd417da830635ec8940109cba58f789bbd856c"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.743818 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerStarted","Data":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.745741 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.747862 4755 generic.go:334] "Generic (PLEG): container finished" podID="34c85756-25cf-4302-bd5d-72f2e459f562" containerID="cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2" exitCode=0 Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.748008 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" 
event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerDied","Data":"cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.751484 4755 generic.go:334] "Generic (PLEG): container finished" podID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerID="357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56" exitCode=0 Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.751535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerDied","Data":"357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56"} Mar 20 13:49:34 crc kubenswrapper[4755]: I0320 13:49:34.771187 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" podStartSLOduration=2.771168695 podStartE2EDuration="2.771168695s" podCreationTimestamp="2026-03-20 13:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:34.764032317 +0000 UTC m=+1154.361964846" watchObservedRunningTime="2026-03-20 13:49:34.771168695 +0000 UTC m=+1154.369101224" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.144541 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.210425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") pod \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.210587 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") pod \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\" (UID: \"8c5d05dc-a589-4d2e-9374-0d57202a3cfc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.214586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c5d05dc-a589-4d2e-9374-0d57202a3cfc" (UID: "8c5d05dc-a589-4d2e-9374-0d57202a3cfc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.218562 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7" (OuterVolumeSpecName: "kube-api-access-n5xn7") pod "8c5d05dc-a589-4d2e-9374-0d57202a3cfc" (UID: "8c5d05dc-a589-4d2e-9374-0d57202a3cfc"). InnerVolumeSpecName "kube-api-access-n5xn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.285856 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.291338 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.302448 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.309833 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.314349 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.314395 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5xn7\" (UniqueName: \"kubernetes.io/projected/8c5d05dc-a589-4d2e-9374-0d57202a3cfc-kube-api-access-n5xn7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416019 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") pod \"feb55e83-711d-4561-8b57-2a231944e1b1\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416435 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") pod \"e38d31ac-eae6-4cd1-be04-304215db852a\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416517 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") pod \"e38d31ac-eae6-4cd1-be04-304215db852a\" (UID: \"e38d31ac-eae6-4cd1-be04-304215db852a\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416549 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") pod \"5dde547e-5fce-4868-ba0e-63650ea0c771\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416626 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") pod \"5dde547e-5fce-4868-ba0e-63650ea0c771\" (UID: \"5dde547e-5fce-4868-ba0e-63650ea0c771\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416693 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") pod \"feb55e83-711d-4561-8b57-2a231944e1b1\" (UID: \"feb55e83-711d-4561-8b57-2a231944e1b1\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") pod \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.416860 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlq6p\" (UniqueName: 
\"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") pod \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\" (UID: \"015c8ae7-1856-4b0c-b5ce-e2503a2080dc\") " Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417441 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e38d31ac-eae6-4cd1-be04-304215db852a" (UID: "e38d31ac-eae6-4cd1-be04-304215db852a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417576 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5dde547e-5fce-4868-ba0e-63650ea0c771" (UID: "5dde547e-5fce-4868-ba0e-63650ea0c771"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417620 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "feb55e83-711d-4561-8b57-2a231944e1b1" (UID: "feb55e83-711d-4561-8b57-2a231944e1b1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.417586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "015c8ae7-1856-4b0c-b5ce-e2503a2080dc" (UID: "015c8ae7-1856-4b0c-b5ce-e2503a2080dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.421447 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk" (OuterVolumeSpecName: "kube-api-access-frkpk") pod "e38d31ac-eae6-4cd1-be04-304215db852a" (UID: "e38d31ac-eae6-4cd1-be04-304215db852a"). InnerVolumeSpecName "kube-api-access-frkpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.422328 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5" (OuterVolumeSpecName: "kube-api-access-82wd5") pod "feb55e83-711d-4561-8b57-2a231944e1b1" (UID: "feb55e83-711d-4561-8b57-2a231944e1b1"). InnerVolumeSpecName "kube-api-access-82wd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.423759 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p" (OuterVolumeSpecName: "kube-api-access-wlq6p") pod "015c8ae7-1856-4b0c-b5ce-e2503a2080dc" (UID: "015c8ae7-1856-4b0c-b5ce-e2503a2080dc"). InnerVolumeSpecName "kube-api-access-wlq6p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.423790 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7" (OuterVolumeSpecName: "kube-api-access-98tb7") pod "5dde547e-5fce-4868-ba0e-63650ea0c771" (UID: "5dde547e-5fce-4868-ba0e-63650ea0c771"). InnerVolumeSpecName "kube-api-access-98tb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520113 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98tb7\" (UniqueName: \"kubernetes.io/projected/5dde547e-5fce-4868-ba0e-63650ea0c771-kube-api-access-98tb7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520157 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82wd5\" (UniqueName: \"kubernetes.io/projected/feb55e83-711d-4561-8b57-2a231944e1b1-kube-api-access-82wd5\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520170 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520179 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlq6p\" (UniqueName: \"kubernetes.io/projected/015c8ae7-1856-4b0c-b5ce-e2503a2080dc-kube-api-access-wlq6p\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520189 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/feb55e83-711d-4561-8b57-2a231944e1b1-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frkpk\" (UniqueName: \"kubernetes.io/projected/e38d31ac-eae6-4cd1-be04-304215db852a-kube-api-access-frkpk\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520206 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e38d31ac-eae6-4cd1-be04-304215db852a-operator-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.520219 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5dde547e-5fce-4868-ba0e-63650ea0c771-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.766934 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-jm9nr" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.766917 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-jm9nr" event={"ID":"feb55e83-711d-4561-8b57-2a231944e1b1","Type":"ContainerDied","Data":"bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.767116 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc3ad24f7ade3ad796eff6e50df9588992f2a7cd48d88e2aeaa5ee71e47caecd" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770357 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-9528-account-create-update-6xkmx" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770353 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-9528-account-create-update-6xkmx" event={"ID":"5dde547e-5fce-4868-ba0e-63650ea0c771","Type":"ContainerDied","Data":"7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.770638 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b99045593a924f15277584d13d0931a3b67834972c9adc02f24a8927b46e3c3" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2jwbt" event={"ID":"8c5d05dc-a589-4d2e-9374-0d57202a3cfc","Type":"ContainerDied","Data":"ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772560 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea2b778d5d0a296e85801408983c6ae18b5c4259e0281b07ac67d0e2cc8163c9" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.772679 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2jwbt" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774377 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-1376-account-create-update-jhbhp" event={"ID":"015c8ae7-1856-4b0c-b5ce-e2503a2080dc","Type":"ContainerDied","Data":"040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774402 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="040880eed8633e76726b57ccd8d767c3e70fc0a48b5aadc1240fac1c3e68983f" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.774402 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-1376-account-create-update-jhbhp" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776123 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-hm9qz" event={"ID":"e38d31ac-eae6-4cd1-be04-304215db852a","Type":"ContainerDied","Data":"842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029"} Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776187 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="842094ddef6d47f893f153b311a559660a91386da7c1ac1005b8b0a242455029" Mar 20 13:49:35 crc kubenswrapper[4755]: I0320 13:49:35.776293 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-hm9qz" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.751621 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.752322 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.752422 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.754435 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:49:36 crc kubenswrapper[4755]: I0320 13:49:36.754733 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" gracePeriod=600 Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840206 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" exitCode=0 Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840271 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773"} Mar 20 13:49:37 crc kubenswrapper[4755]: I0320 13:49:37.840321 4755 scope.go:117] "RemoveContainer" containerID="d4ef017003069a41260d618026991304ac053060f31df357b6bf383a8143ed38" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.584013 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.591146 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735153 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99pj7\" (UniqueName: \"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") pod \"34c85756-25cf-4302-bd5d-72f2e459f562\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735440 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735489 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") pod \"34c85756-25cf-4302-bd5d-72f2e459f562\" (UID: \"34c85756-25cf-4302-bd5d-72f2e459f562\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.735544 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") pod \"3047e6fe-5128-4361-bede-e9f0c4e9387c\" (UID: \"3047e6fe-5128-4361-bede-e9f0c4e9387c\") " Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.736923 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "34c85756-25cf-4302-bd5d-72f2e459f562" (UID: "34c85756-25cf-4302-bd5d-72f2e459f562"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.743360 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2" (OuterVolumeSpecName: "kube-api-access-sxsh2") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "kube-api-access-sxsh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.743368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.744283 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7" (OuterVolumeSpecName: "kube-api-access-99pj7") pod "34c85756-25cf-4302-bd5d-72f2e459f562" (UID: "34c85756-25cf-4302-bd5d-72f2e459f562"). InnerVolumeSpecName "kube-api-access-99pj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.773045 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.794977 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data" (OuterVolumeSpecName: "config-data") pod "3047e6fe-5128-4361-bede-e9f0c4e9387c" (UID: "3047e6fe-5128-4361-bede-e9f0c4e9387c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837472 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsh2\" (UniqueName: \"kubernetes.io/projected/3047e6fe-5128-4361-bede-e9f0c4e9387c-kube-api-access-sxsh2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837501 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837511 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837520 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99pj7\" (UniqueName: 
\"kubernetes.io/projected/34c85756-25cf-4302-bd5d-72f2e459f562-kube-api-access-99pj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837529 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3047e6fe-5128-4361-bede-e9f0c4e9387c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.837538 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/34c85756-25cf-4302-bd5d-72f2e459f562-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fc04-account-create-update-x9t57" event={"ID":"34c85756-25cf-4302-bd5d-72f2e459f562","Type":"ContainerDied","Data":"b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00"} Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854535 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b815284f847f4e1cd2a24d0712626799335f15b7cef85a3bb219a3d7dddfee00" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.854618 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fc04-account-create-update-x9t57" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-w78rr" event={"ID":"3047e6fe-5128-4361-bede-e9f0c4e9387c","Type":"ContainerDied","Data":"035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499"} Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862849 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="035267f344c2c393db92cbe47eba53034472523719d5e4bdaa97232f00452499" Mar 20 13:49:38 crc kubenswrapper[4755]: I0320 13:49:38.862917 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-w78rr" Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.878753 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.881956 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerStarted","Data":"291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6"} Mar 20 13:49:39 crc kubenswrapper[4755]: I0320 13:49:39.939592 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-9xrbx" podStartSLOduration=2.833014065 podStartE2EDuration="7.939564659s" podCreationTimestamp="2026-03-20 13:49:32 +0000 UTC" firstStartedPulling="2026-03-20 13:49:33.490335216 +0000 UTC m=+1153.088267745" lastFinishedPulling="2026-03-20 13:49:38.59688579 +0000 UTC m=+1158.194818339" observedRunningTime="2026-03-20 13:49:39.931328493 +0000 UTC m=+1159.529261042" 
watchObservedRunningTime="2026-03-20 13:49:39.939564659 +0000 UTC m=+1159.537497198" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.170704 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.171091 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns" containerID="cri-o://ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" gracePeriod=10 Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.182067 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.240750 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241859 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241878 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241894 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241900 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241924 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" 
containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241931 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241940 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241947 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241958 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.241964 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.241994 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242004 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: E0320 13:49:40.242018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242025 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242230 4755 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e38d31ac-eae6-4cd1-be04-304215db852a" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242243 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242262 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" containerName="mariadb-database-create" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242271 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242290 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242307 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" containerName="mariadb-account-create-update" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.242317 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" containerName="glance-db-sync" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.243607 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.253287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369750 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.369948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370077 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370419 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.370486 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.473872 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.473945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474044 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474090 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474113 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.474178 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.477348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.478603 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.479303 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.502378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"dnsmasq-dns-5f59b8f679-x266l\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.628938 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.750062 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.781793 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782085 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782140 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782261 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.782328 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") pod \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\" (UID: \"87a11166-3f5f-4f57-a8ba-19f88c636ee7\") " Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.800945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg" (OuterVolumeSpecName: "kube-api-access-jq7rg") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "kube-api-access-jq7rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.829039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.848424 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.853458 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.868553 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.883980 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config" (OuterVolumeSpecName: "config") pod "87a11166-3f5f-4f57-a8ba-19f88c636ee7" (UID: "87a11166-3f5f-4f57-a8ba-19f88c636ee7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884272 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884298 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jq7rg\" (UniqueName: \"kubernetes.io/projected/87a11166-3f5f-4f57-a8ba-19f88c636ee7-kube-api-access-jq7rg\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884315 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884327 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc 
kubenswrapper[4755]: I0320 13:49:40.884337 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.884350 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87a11166-3f5f-4f57-a8ba-19f88c636ee7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.899359 4755 generic.go:334] "Generic (PLEG): container finished" podID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" exitCode=0 Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.900029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.906024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" event={"ID":"87a11166-3f5f-4f57-a8ba-19f88c636ee7","Type":"ContainerDied","Data":"61e599f2b06be42605f3f5420cbd417da830635ec8940109cba58f789bbd856c"} Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.906773 4755 scope.go:117] "RemoveContainer" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.907370 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-x5fhv" Mar 20 13:49:40 crc kubenswrapper[4755]: I0320 13:49:40.977796 4755 scope.go:117] "RemoveContainer" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.003931 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.019018 4755 scope.go:117] "RemoveContainer" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:41 crc kubenswrapper[4755]: E0320 13:49:41.020604 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": container with ID starting with ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11 not found: ID does not exist" containerID="ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.020646 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11"} err="failed to get container status \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": rpc error: code = NotFound desc = could not find container \"ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11\": container with ID starting with ae4a842d6a272007fe21e8616134f3d00713ca7ba2aeda1d38a274566d482f11 not found: ID does not exist" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.020685 4755 scope.go:117] "RemoveContainer" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: E0320 13:49:41.021598 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": container with ID starting with 27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788 not found: ID does not exist" containerID="27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.021697 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788"} err="failed to get container status \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": rpc error: code = NotFound desc = could not find container \"27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788\": container with ID starting with 27d8f8ea35f97f7166081ed052bf53ba28013e3c7df846a4f7a6b0bc8fdf5788 not found: ID does not exist" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.021830 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-x5fhv"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.142693 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.239932 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" path="/var/lib/kubelet/pods/87a11166-3f5f-4f57-a8ba-19f88c636ee7/volumes" Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.913675 4755 generic.go:334] "Generic (PLEG): container finished" podID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerID="e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc" exitCode=0 Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.913802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" 
event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc"} Mar 20 13:49:41 crc kubenswrapper[4755]: I0320 13:49:41.914223 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerStarted","Data":"35342b414d35d91006e710a0dacee45f33a514dab176d4298d187fb90fe3be69"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.926507 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerStarted","Data":"2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.926820 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.928999 4755 generic.go:334] "Generic (PLEG): container finished" podID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerID="291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6" exitCode=0 Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.929053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerDied","Data":"291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6"} Mar 20 13:49:42 crc kubenswrapper[4755]: I0320 13:49:42.958046 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" podStartSLOduration=2.958025964 podStartE2EDuration="2.958025964s" podCreationTimestamp="2026-03-20 13:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:42.951847632 +0000 UTC 
m=+1162.549780202" watchObservedRunningTime="2026-03-20 13:49:42.958025964 +0000 UTC m=+1162.555958483" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.383244 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409068 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409376 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.409765 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") pod \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\" (UID: \"64ad8e64-0606-4171-bd2d-ae8212fdff8f\") " Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.444969 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7" (OuterVolumeSpecName: "kube-api-access-2k6t7") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "kube-api-access-2k6t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.472996 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.509352 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data" (OuterVolumeSpecName: "config-data") pod "64ad8e64-0606-4171-bd2d-ae8212fdff8f" (UID: "64ad8e64-0606-4171-bd2d-ae8212fdff8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520172 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520227 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k6t7\" (UniqueName: \"kubernetes.io/projected/64ad8e64-0606-4171-bd2d-ae8212fdff8f-kube-api-access-2k6t7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.520245 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64ad8e64-0606-4171-bd2d-ae8212fdff8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950471 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-9xrbx" 
event={"ID":"64ad8e64-0606-4171-bd2d-ae8212fdff8f","Type":"ContainerDied","Data":"ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21"} Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950520 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd60306421da63e9395ed5fc49b2d20ea36652b391d1fdf8e39d3f282043e21" Mar 20 13:49:44 crc kubenswrapper[4755]: I0320 13:49:44.950599 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-9xrbx" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.285850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.286827 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" containerID="cri-o://2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" gracePeriod=10 Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.333243 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334160 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334292 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns" Mar 20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334367 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="init" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334416 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="init" Mar 
20 13:49:45 crc kubenswrapper[4755]: E0320 13:49:45.334491 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334552 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334816 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="87a11166-3f5f-4f57-a8ba-19f88c636ee7" containerName="dnsmasq-dns" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.334897 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" containerName="keystone-db-sync" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.335743 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339325 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339686 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.339819 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.341097 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.343317 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.347481 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 
13:49:45.349751 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.367869 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.374476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441735 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441783 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441802 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " 
pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441951 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441971 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: 
\"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.441989 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.442010 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.442046 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544203 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: 
\"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544225 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544241 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544298 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: 
I0320 13:49:45.544362 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544384 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544404 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544429 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.544467 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.548984 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558207 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.558483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.559695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.560142 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.560430 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.561724 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.570558 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.573897 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.575576 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.582018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.588839 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.589294 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.589770 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-8gg2p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.590453 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.597058 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.598129 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603303 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603379 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.603541 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4kqwk" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.616484 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"dnsmasq-dns-bbf5cc879-vqvbt\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.629737 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.641361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"keystone-bootstrap-7d8ml\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645549 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645594 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645719 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645773 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645792 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.645819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.658061 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.679794 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.684932 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.721428 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.727498 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732635 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ncc5q" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732851 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.732864 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751821 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751844 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751864 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: 
\"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751893 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751918 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751935 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751963 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67" Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.751984 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p" Mar 20 
13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752047 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752088 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.752495 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.754338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.755239 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.759936 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.774040 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.774251 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.800848 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-jrf8c"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.813672 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.820206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"horizon-8c6c7fc7-qs6b6\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") " pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.823092 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"neutron-db-sync-52m67\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") " pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.827499 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-dtggj"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.828752 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.829441 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.829635 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t52g6"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.830070 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.831850 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nvndn"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.834615 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862356 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862398 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862459 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.862486 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864191 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jrf8c"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864465 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864599 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864618 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864677 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864705 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864725 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.864913 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.867498 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.871623 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.871733 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.892093 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.905371 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"placement-db-sync-cxr9p\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.905453 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.947246 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtggj"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.963228 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cxr9p"]
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986131 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986411 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986638 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986735 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986819 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.986907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.987001 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:45 crc kubenswrapper[4755]: I0320 13:49:45.987093 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.002062 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.003296 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.006493 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.009279 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.010153 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.010822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.014169 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.015779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.019306 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.032822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"cinder-db-sync-jrf8c\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.033894 4755 generic.go:334] "Generic (PLEG): container finished" podID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerID="2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" exitCode=0
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.034113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9"}
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.054379 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.056475 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.056548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"barbican-db-sync-dtggj\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.075880 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.077496 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.089827 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090129 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090199 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090221 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090287 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.090425 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.098185 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.114500 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.160979 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183190 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183241 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.183354 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.186481 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187299 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187563 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.187602 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.191326 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.191747 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.199477 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207392 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207478 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207606 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207674 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207800 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.207968 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208140 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208303 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.208360 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.209524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.209725 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210034 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-jrf8c"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.210759 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.212831 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.212910 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.215882 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.216350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.218058 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.222479 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.228676 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.257535 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"horizon-74d5b88dcf-ftnlg\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " pod="openstack/horizon-74d5b88dcf-ftnlg"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.261520 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"dnsmasq-dns-56df8fb6b7-r7dnf\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311365 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311844 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311919 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311945 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.311985 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312011 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0"
Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312036 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0"
Mar 20 13:49:46 crc 
kubenswrapper[4755]: I0320 13:49:46.312058 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312082 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312134 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312202 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.312252 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414383 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414545 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414568 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414597 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414627 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414646 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414682 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414703 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414721 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414741 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414762 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.414788 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.417773 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.421182 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.424197 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.425958 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.430496 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.431971 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.433253 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.434897 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.435895 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.435943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.437041 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.439333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.439388 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.452957 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.454353 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.482160 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.494266 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ceilometer-0\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.518744 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.520919 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"glance-default-external-api-0\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.544508 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.617888 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.625443 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721743 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721863 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721909 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.721984 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: 
\"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.722150 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") pod \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\" (UID: \"1111c2ae-21ad-47b4-9ec0-e51d507a864e\") " Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.728278 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p" (OuterVolumeSpecName: "kube-api-access-69w7p") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "kube-api-access-69w7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.813956 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: E0320 13:49:46.814580 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="init" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="init" Mar 20 13:49:46 crc kubenswrapper[4755]: E0320 13:49:46.814635 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814644 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.814924 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" containerName="dnsmasq-dns" Mar 20 13:49:46 crc 
kubenswrapper[4755]: I0320 13:49:46.827316 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.831514 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69w7p\" (UniqueName: \"kubernetes.io/projected/1111c2ae-21ad-47b4-9ec0-e51d507a864e-kube-api-access-69w7p\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.833274 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.834797 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.836294 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.841196 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.844972 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.859489 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config" (OuterVolumeSpecName: "config") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.866375 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.879280 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.914126 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1111c2ae-21ad-47b4-9ec0-e51d507a864e" (UID: "1111c2ae-21ad-47b4-9ec0-e51d507a864e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934829 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934915 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.934993 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935069 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935242 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935490 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935516 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935532 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935546 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.935555 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1111c2ae-21ad-47b4-9ec0-e51d507a864e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.936683 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:46 crc kubenswrapper[4755]: I0320 13:49:46.947430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 13:49:46 crc kubenswrapper[4755]: W0320 13:49:46.951602 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e3b9192_0e1c_4c85_82de_3a54a4272c48.slice/crio-bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5 WatchSource:0}: Error finding container bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5: Status 404 returned error can't find the container with id bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5 Mar 20 13:49:46 crc kubenswrapper[4755]: W0320 13:49:46.955267 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69707be4_e338_4e13_8ecc_8cfd7cd416b2.slice/crio-312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997 WatchSource:0}: Error finding container 312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997: Status 404 returned error can't find the container with id 
312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997 Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042671 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042717 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042768 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042789 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042837 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc 
kubenswrapper[4755]: I0320 13:49:47.042862 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042878 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.042920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.043369 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.043410 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.045991 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.050114 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.051378 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.056878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.057417 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.075233 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.076053 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-x266l" event={"ID":"1111c2ae-21ad-47b4-9ec0-e51d507a864e","Type":"ContainerDied","Data":"35342b414d35d91006e710a0dacee45f33a514dab176d4298d187fb90fe3be69"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.076134 4755 scope.go:117] "RemoveContainer" containerID="2fe1566c01ae52fa91ff1c91f685651bb26a1b16eee31f217ade240877fac5f9" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.078990 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.094439 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerStarted","Data":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.095000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerStarted","Data":"187f0aeadf4e124f27492aa0d9af1cc05ae52352a5bbdb94c65aff34e13e285c"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.101792 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c6c7fc7-qs6b6" event={"ID":"7e3b9192-0e1c-4c85-82de-3a54a4272c48","Type":"ContainerStarted","Data":"bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.103040 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerStarted","Data":"7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.123726 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerStarted","Data":"312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997"} Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.124033 4755 scope.go:117] "RemoveContainer" containerID="e01d91a7e23ddb2ed11973d411a56ca34359b15e81493c7a678686b1bd95c9dc" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.147023 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: W0320 13:49:47.163384 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ea35a84_68ca_4490_b1d9_fa999ef63ebe.slice/crio-7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662 WatchSource:0}: Error finding container 7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662: Status 404 returned error can't find the container with id 7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662 Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.169747 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.175542 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.212211 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.254972 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-x266l"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.343043 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.373149 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.429770 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:47 crc kubenswrapper[4755]: W0320 13:49:47.528931 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e WatchSource:0}: Error finding container aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e: Status 404 returned error can't find the container with id aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.530288 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.578749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.814433 4755 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862458 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862539 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862623 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862791 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.862854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") pod \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\" (UID: \"4132d383-5c0f-4d4f-9622-c0e5c41d6568\") " Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.881823 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6" (OuterVolumeSpecName: "kube-api-access-gn7v6") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "kube-api-access-gn7v6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.899784 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.914444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.953462 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config" (OuterVolumeSpecName: "config") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.955287 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966225 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4132d383-5c0f-4d4f-9622-c0e5c41d6568" (UID: "4132d383-5c0f-4d4f-9622-c0e5c41d6568"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966578 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966617 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966631 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966642 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn7v6\" (UniqueName: \"kubernetes.io/projected/4132d383-5c0f-4d4f-9622-c0e5c41d6568-kube-api-access-gn7v6\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966773 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.966785 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4132d383-5c0f-4d4f-9622-c0e5c41d6568-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:47 crc kubenswrapper[4755]: I0320 13:49:47.993005 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.086138 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: W0320 13:49:48.140361 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d02dff6_d832_40b7_8291_f7f08be96659.slice/crio-c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c WatchSource:0}: Error finding container c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c: Status 404 returned error can't find the container with id c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.147374 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerStarted","Data":"7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.149813 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"fa39cf0d73d745c483cbf46583864f4385b896f58bc8b0da2a658f1a87cd2c55"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153109 4755 generic.go:334] "Generic (PLEG): container finished" podID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" exitCode=0 Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerDied","Data":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153179 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" event={"ID":"4132d383-5c0f-4d4f-9622-c0e5c41d6568","Type":"ContainerDied","Data":"187f0aeadf4e124f27492aa0d9af1cc05ae52352a5bbdb94c65aff34e13e285c"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153197 4755 scope.go:117] "RemoveContainer" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.153267 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-vqvbt" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.163817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerStarted","Data":"bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.171349 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.187500 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerStarted","Data":"39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.193476 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerStarted","Data":"46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.199861 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7d8ml" podStartSLOduration=3.199840168 podStartE2EDuration="3.199840168s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.185504537 +0000 UTC m=+1167.783437136" watchObservedRunningTime="2026-03-20 13:49:48.199840168 +0000 UTC m=+1167.797772697" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.210262 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-jrf8c" event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerStarted","Data":"88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.233128 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.244544 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-vqvbt"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.245385 4755 scope.go:117] "RemoveContainer" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: E0320 13:49:48.247138 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": container with ID starting with 725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea not found: ID does not exist" containerID="725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.247190 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea"} err="failed to get container status \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": rpc error: code = NotFound desc = could not find container \"725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea\": container with ID starting with 725dd6b314526e9f683bd294499e7666300e620f47a4a36e28a5bba45dc84aea not found: ID does not exist" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.318971 4755 generic.go:334] "Generic (PLEG): container finished" podID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerID="e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295" 
exitCode=0 Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.319063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.319107 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerStarted","Data":"f4cea975e04082628cd5787018d084f226541b1327d170cee0f4b957229de5d6"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.347988 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerStarted","Data":"935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb"} Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.362251 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-52m67" podStartSLOduration=3.362231364 podStartE2EDuration="3.362231364s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.280424664 +0000 UTC m=+1167.878357193" watchObservedRunningTime="2026-03-20 13:49:48.362231364 +0000 UTC m=+1167.960163893" Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.781312 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.957458 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:49:48 crc kubenswrapper[4755]: I0320 13:49:48.974376 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.070323 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120330 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:49 crc kubenswrapper[4755]: E0320 13:49:49.120737 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120753 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.120923 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" containerName="init" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.121826 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.149458 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.247320 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1111c2ae-21ad-47b4-9ec0-e51d507a864e" path="/var/lib/kubelet/pods/1111c2ae-21ad-47b4-9ec0-e51d507a864e/volumes" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.248139 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4132d383-5c0f-4d4f-9622-c0e5c41d6568" path="/var/lib/kubelet/pods/4132d383-5c0f-4d4f-9622-c0e5c41d6568/volumes" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304105 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304185 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304441 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.304501 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.381148 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerStarted","Data":"70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f"} Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.381235 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.386455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c"} Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.409346 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podStartSLOduration=4.409326772 podStartE2EDuration="4.409326772s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:49.405623336 +0000 UTC m=+1169.003555865" 
watchObservedRunningTime="2026-03-20 13:49:49.409326772 +0000 UTC m=+1169.007259301" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.416689 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.417125 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.417331 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.418108 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.418184 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc 
kubenswrapper[4755]: I0320 13:49:49.419732 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.421011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.421173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.427357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.441162 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"horizon-6fb9d46f97-rdkvb\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:49 crc kubenswrapper[4755]: I0320 13:49:49.458196 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.137818 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:50 crc kubenswrapper[4755]: W0320 13:49:50.161837 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827 WatchSource:0}: Error finding container c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827: Status 404 returned error can't find the container with id c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827 Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.416029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8"} Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.429236 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c"} Mar 20 13:49:50 crc kubenswrapper[4755]: I0320 13:49:50.432074 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446003 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerStarted","Data":"d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446119 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" containerID="cri-o://64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.446180 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" containerID="cri-o://d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449004 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerStarted","Data":"54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb"} Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449349 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" containerID="cri-o://6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.449366 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" containerID="cri-o://54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb" gracePeriod=30 Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.485917 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.485879735 podStartE2EDuration="6.485879735s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:51.478273209 +0000 UTC m=+1171.076205738" watchObservedRunningTime="2026-03-20 13:49:51.485879735 +0000 UTC m=+1171.083812264" Mar 20 13:49:51 crc kubenswrapper[4755]: I0320 13:49:51.515332 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.515313643 podStartE2EDuration="6.515313643s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:51.503580931 +0000 UTC m=+1171.101513460" watchObservedRunningTime="2026-03-20 13:49:51.515313643 +0000 UTC m=+1171.113246172" Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.480178 4755 generic.go:334] "Generic (PLEG): container finished" podID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerID="bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.480262 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerDied","Data":"bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486123 4755 generic.go:334] "Generic (PLEG): container finished" podID="1d02dff6-d832-40b7-8291-f7f08be96659" containerID="54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486160 4755 generic.go:334] "Generic (PLEG): 
container finished" podID="1d02dff6-d832-40b7-8291-f7f08be96659" containerID="6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" exitCode=143 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.486249 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499095 4755 generic.go:334] "Generic (PLEG): container finished" podID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerID="d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499156 4755 generic.go:334] "Generic (PLEG): container finished" podID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerID="64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" exitCode=143 Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc"} Mar 20 13:49:52 crc kubenswrapper[4755]: I0320 13:49:52.499293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c"} Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.342721 4755 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.402126 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.404818 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.408738 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.447243 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480278 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480372 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480446 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc 
kubenswrapper[4755]: I0320 13:49:54.480512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480598 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480689 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.480941 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.527953 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.543771 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.546197 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.557052 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583809 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583891 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583921 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583944 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.583991 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584029 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584063 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584113 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584146 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-525ds\" (UniqueName: 
\"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584165 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584188 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.584277 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.587802 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod 
\"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.591096 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.593346 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.610299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.617573 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.622229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 
13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.623156 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"horizon-54fd48b444-c4c9l\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.687676 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688027 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2af5836e-8c76-4432-95c0-ef34d6fc3528-logs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688710 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688838 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.688880 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689294 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689357 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.689881 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-scripts\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.690740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2af5836e-8c76-4432-95c0-ef34d6fc3528-config-data\") pod 
\"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.692789 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-tls-certs\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.693297 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-combined-ca-bundle\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.707292 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2af5836e-8c76-4432-95c0-ef34d6fc3528-horizon-secret-key\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.710326 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xxh\" (UniqueName: \"kubernetes.io/projected/2af5836e-8c76-4432-95c0-ef34d6fc3528-kube-api-access-88xxh\") pod \"horizon-9f7d4c74d-t7tpq\" (UID: \"2af5836e-8c76-4432-95c0-ef34d6fc3528\") " pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.747897 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:49:54 crc kubenswrapper[4755]: I0320 13:49:54.868646 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.432210 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.502160 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:49:56 crc kubenswrapper[4755]: I0320 13:49:56.502411 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" containerID="cri-o://0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" gracePeriod=10 Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.192771 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.569251 4755 generic.go:334] "Generic (PLEG): container finished" podID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerID="0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" exitCode=0 Mar 20 13:49:57 crc kubenswrapper[4755]: I0320 13:49:57.569332 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141"} Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.136209 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.138442 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147247 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147573 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.147795 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.149632 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.312002 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.414132 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.434191 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"auto-csr-approver-29566910-g8cp9\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " 
pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:00 crc kubenswrapper[4755]: I0320 13:50:00.467584 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:02 crc kubenswrapper[4755]: I0320 13:50:02.192366 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:07 crc kubenswrapper[4755]: I0320 13:50:07.191863 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:07 crc kubenswrapper[4755]: I0320 13:50:07.192523 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.837608 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.838177 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m97kr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-cxr9p_openstack(7ea35a84-68ca-4490-b1d9-fa999ef63ebe): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:09 crc kubenswrapper[4755]: E0320 13:50:09.839638 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-cxr9p" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.138098 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.138421 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbbh67hfh64fhc6h694hb8h5c7h58fh6bh5f5h58bh5b8h5d9h5d7h65dh688h65bh678h598h95h668hb7hb9h6fh67dh555h69hbbh5f8h665h555q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxd7x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(ded8942b-87a3-49fa-80fb-dc830c09f18d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.205467 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.205637 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n58h8ch55fh84h94hf8h96h5h594h564h694h594h99h85hb7h564h647h66dh576hd6h597hb9h659hf5h65ch5c9h8fh5f4h66fhf6h55ch567q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tjbkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-8c6c7fc7-qs6b6_openstack(7e3b9192-0e1c-4c85-82de-3a54a4272c48): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.208070 
4755 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-8c6c7fc7-qs6b6" podUID="7e3b9192-0e1c-4c85-82de-3a54a4272c48" Mar 20 13:50:10 crc kubenswrapper[4755]: I0320 13:50:10.693212 4755 generic.go:334] "Generic (PLEG): container finished" podID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerID="46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967" exitCode=0 Mar 20 13:50:10 crc kubenswrapper[4755]: I0320 13:50:10.693308 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerDied","Data":"46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967"} Mar 20 13:50:10 crc kubenswrapper[4755]: E0320 13:50:10.695869 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-cxr9p" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.192817 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.119:5353: connect: connection refused" Mar 20 13:50:12 crc kubenswrapper[4755]: E0320 13:50:12.288301 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 20 13:50:12 crc kubenswrapper[4755]: E0320 13:50:12.288482 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n697h574h5ffh57bh675hd4h5f4hf4h565h67fh696h96h58fh697hd7h89h5dbh86h5f9h545h5d4h5c7h5c6h5c5hd7h5cfh94h549h676h7bh657h668q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mv45n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDev
ices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-74d5b88dcf-ftnlg_openstack(2f75cbbe-c852-4090-aca4-42cd87a3a9b3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.373527 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500427 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500487 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.500590 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") pod \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\" (UID: \"f52d787e-af63-491a-a3f9-2a9626a9f8b8\") " Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.507911 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.509906 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts" (OuterVolumeSpecName: "scripts") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.527897 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh" (OuterVolumeSpecName: "kube-api-access-jr7jh") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "kube-api-access-jr7jh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.533851 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.538194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.561109 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data" (OuterVolumeSpecName: "config-data") pod "f52d787e-af63-491a-a3f9-2a9626a9f8b8" (UID: "f52d787e-af63-491a-a3f9-2a9626a9f8b8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603149 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603187 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603197 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr7jh\" (UniqueName: \"kubernetes.io/projected/f52d787e-af63-491a-a3f9-2a9626a9f8b8-kube-api-access-jr7jh\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603207 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603214 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.603222 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f52d787e-af63-491a-a3f9-2a9626a9f8b8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.721114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7d8ml" event={"ID":"f52d787e-af63-491a-a3f9-2a9626a9f8b8","Type":"ContainerDied","Data":"7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6"} Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 
13:50:12.721179 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a344e33e76b6340b3d36c5febb65925d0d2e9247a3cde79674c8ed947f90df6" Mar 20 13:50:12 crc kubenswrapper[4755]: I0320 13:50:12.721297 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7d8ml" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.469756 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.481683 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7d8ml"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.570604 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.571111 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571134 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571303 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" containerName="keystone-bootstrap" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.571914 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573666 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573965 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.573980 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.574820 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.582785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.592153 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625571 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625743 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625796 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625853 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.625895 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727901 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.727984 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.728065 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.728089 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733383 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"keystone-bootstrap-rwsvb\" (UID: 
\"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733491 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.733499 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.735524 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.736541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.745523 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"keystone-bootstrap-rwsvb\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 
13:50:13.880845 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified"
Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.881057 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fl4vg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-jrf8c_openstack(25bd1da4-7fdb-4bd9-8405-a37fc6c18be0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:50:13 crc kubenswrapper[4755]: E0320 13:50:13.883937 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-jrf8c" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"
Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.901521 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb"
Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.977557 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6"
Mar 20 13:50:13 crc kubenswrapper[4755]: I0320 13:50:13.990468 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:13.998630 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.006105 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036161 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036569 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs" (OuterVolumeSpecName: "logs") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.036997 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037031 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037223 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") pod \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\" (UID: \"7e3b9192-0e1c-4c85-82de-3a54a4272c48\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.037855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data" (OuterVolumeSpecName: "config-data") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038241 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7e3b9192-0e1c-4c85-82de-3a54a4272c48-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038255 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.038831 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts" (OuterVolumeSpecName: "scripts") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.079522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.085467 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf" (OuterVolumeSpecName: "kube-api-access-tjbkf") pod "7e3b9192-0e1c-4c85-82de-3a54a4272c48" (UID: "7e3b9192-0e1c-4c85-82de-3a54a4272c48"). InnerVolumeSpecName "kube-api-access-tjbkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139669 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139733 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139764 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139788 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139812 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139850 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139870 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139886 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139920 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139955 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") pod \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\" (UID: \"69707be4-e338-4e13-8ecc-8cfd7cd416b2\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.139996 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140014 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140067 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140121 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140139 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140195 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") pod \"89ac3c10-1912-4807-a62e-d91f5e5682b4\" (UID: \"89ac3c10-1912-4807-a62e-d91f5e5682b4\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") pod \"1d02dff6-d832-40b7-8291-f7f08be96659\" (UID: \"1d02dff6-d832-40b7-8291-f7f08be96659\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140293 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs" (OuterVolumeSpecName: "logs") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140563 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e3b9192-0e1c-4c85-82de-3a54a4272c48-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140580 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjbkf\" (UniqueName: \"kubernetes.io/projected/7e3b9192-0e1c-4c85-82de-3a54a4272c48-kube-api-access-tjbkf\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140593 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.140603 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/7e3b9192-0e1c-4c85-82de-3a54a4272c48-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.143341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.146701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts" (OuterVolumeSpecName: "scripts") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.147303 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.148116 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl" (OuterVolumeSpecName: "kube-api-access-26ffl") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "kube-api-access-26ffl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.149416 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68" (OuterVolumeSpecName: "kube-api-access-8kh68") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "kube-api-access-8kh68". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.149800 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs" (OuterVolumeSpecName: "logs") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.150048 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.150184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj" (OuterVolumeSpecName: "kube-api-access-6l4tj") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "kube-api-access-6l4tj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.155868 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.167629 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts" (OuterVolumeSpecName: "scripts") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.175341 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.182829 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.187062 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.195339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config" (OuterVolumeSpecName: "config") pod "69707be4-e338-4e13-8ecc-8cfd7cd416b2" (UID: "69707be4-e338-4e13-8ecc-8cfd7cd416b2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.199884 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data" (OuterVolumeSpecName: "config-data") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.200339 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1d02dff6-d832-40b7-8291-f7f08be96659" (UID: "1d02dff6-d832-40b7-8291-f7f08be96659"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.205747 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data" (OuterVolumeSpecName: "config-data") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.218091 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "89ac3c10-1912-4807-a62e-d91f5e5682b4" (UID: "89ac3c10-1912-4807-a62e-d91f5e5682b4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242075 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242132 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242142 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ffl\" (UniqueName: \"kubernetes.io/projected/1d02dff6-d832-40b7-8291-f7f08be96659-kube-api-access-26ffl\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242153 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242163 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242173 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kh68\" (UniqueName: \"kubernetes.io/projected/89ac3c10-1912-4807-a62e-d91f5e5682b4-kube-api-access-8kh68\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242182 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242190 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6l4tj\" (UniqueName: \"kubernetes.io/projected/69707be4-e338-4e13-8ecc-8cfd7cd416b2-kube-api-access-6l4tj\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242198 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242205 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69707be4-e338-4e13-8ecc-8cfd7cd416b2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242218 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242227 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242236 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d02dff6-d832-40b7-8291-f7f08be96659-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242247 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242257 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242271 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89ac3c10-1912-4807-a62e-d91f5e5682b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242281 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/89ac3c10-1912-4807-a62e-d91f5e5682b4-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.242289 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d02dff6-d832-40b7-8291-f7f08be96659-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.268608 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.274625 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.344969 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.345034 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.544821 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb"
Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.552701 4755 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.552956 4755 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8zjs7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-dtggj_openstack(95c76f8c-7b76-4714-adac-6297b84d6492): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.554235 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-dtggj" podUID="95c76f8c-7b76-4714-adac-6297b84d6492"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651161 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651407 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651509 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651530 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.651578 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") pod \"9f832429-63c8-4af9-b0ed-26e3f989125c\" (UID: \"9f832429-63c8-4af9-b0ed-26e3f989125c\") "
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.659449 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4" (OuterVolumeSpecName: "kube-api-access-v57f4") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "kube-api-access-v57f4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.696025 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.696697 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.698081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config" (OuterVolumeSpecName: "config") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.718881 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f832429-63c8-4af9-b0ed-26e3f989125c" (UID: "9f832429-63c8-4af9-b0ed-26e3f989125c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746860 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-52m67"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746890 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-52m67" event={"ID":"69707be4-e338-4e13-8ecc-8cfd7cd416b2","Type":"ContainerDied","Data":"312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997"}
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.746947 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="312eb86a57a77517e5de582fa9a08a4920f9e622aae1ea1f3a5582c6186c3997"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749424 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749797 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1d02dff6-d832-40b7-8291-f7f08be96659","Type":"ContainerDied","Data":"c9157bd4f6fa6ecd8ab69167267a38cb410089e183f7aede955529378ccefd2c"}
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.749844 4755 scope.go:117] "RemoveContainer" containerID="54c2634f771f17a70ab11541cece5d4482010967a08957a83904e9db8cec79eb"
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753005 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753028 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v57f4\" (UniqueName: \"kubernetes.io/projected/9f832429-63c8-4af9-b0ed-26e3f989125c-kube-api-access-v57f4\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753039 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753049 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.753071 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f832429-63c8-4af9-b0ed-26e3f989125c-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.761000 4755 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openstack/glance-default-external-api-0" event={"ID":"89ac3c10-1912-4807-a62e-d91f5e5682b4","Type":"ContainerDied","Data":"fa39cf0d73d745c483cbf46583864f4385b896f58bc8b0da2a658f1a87cd2c55"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.761112 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.775328 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" event={"ID":"9f832429-63c8-4af9-b0ed-26e3f989125c","Type":"ContainerDied","Data":"267e7baeb6290269d8531900c4aac9bc633ebbbaa20000911465b29c50a00f91"} Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.775390 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6mzzb" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.778400 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8c6c7fc7-qs6b6" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.778514 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8c6c7fc7-qs6b6" event={"ID":"7e3b9192-0e1c-4c85-82de-3a54a4272c48","Type":"ContainerDied","Data":"bd6a9f656be94c41843d5494f6c9496efb34309bacbe080011d7904a3086e8c5"} Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.780987 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-jrf8c" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.781098 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-dtggj" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.864257 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.874638 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6mzzb"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.880098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.887184 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.924669 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:14 crc 
kubenswrapper[4755]: I0320 13:50:14.943237 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.943948 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944034 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944104 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="init" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944165 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="init" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944228 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944278 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944344 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944409 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944480 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944540 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944612 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944684 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: E0320 13:50:14.944753 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.944809 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945008 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" containerName="neutron-db-sync" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945073 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945134 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-log" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945206 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945262 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" containerName="glance-httpd" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.945320 4755 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" containerName="dnsmasq-dns" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.946268 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949057 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949137 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-f4frh" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.949294 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.950508 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.950778 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.971616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:14 crc kubenswrapper[4755]: I0320 13:50:14.996913 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.004924 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.011394 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.011983 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.037307 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.044512 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8c6c7fc7-qs6b6"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.051645 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.071636 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9f7d4c74d-t7tpq"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.085458 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.085892 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.088385 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.088720 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089008 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089285 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.089846 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.090158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.090855 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091076 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091258 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091421 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.091616 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.092414 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.092557 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.195433 4755 scope.go:117] "RemoveContainer" containerID="6038e3a835045f3e0f5ff8f8a8db87186d7d3fe0d2688e07ab056cc78db0c0f8" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198886 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod 
\"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198945 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.198975 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199019 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199052 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199085 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199127 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199163 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199179 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199199 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc 
kubenswrapper[4755]: I0320 13:50:15.199216 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199238 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199301 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.199321 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200104 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200708 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200898 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.200959 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.202455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.203991 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.208427 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.208709 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.209219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.217119 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.220768 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.221042 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.239095 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.256981 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.267139 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d02dff6-d832-40b7-8291-f7f08be96659" path="/var/lib/kubelet/pods/1d02dff6-d832-40b7-8291-f7f08be96659/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.271112 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272190 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"glance-default-internal-api-0\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272287 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e3b9192-0e1c-4c85-82de-3a54a4272c48" path="/var/lib/kubelet/pods/7e3b9192-0e1c-4c85-82de-3a54a4272c48/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.272983 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89ac3c10-1912-4807-a62e-d91f5e5682b4" path="/var/lib/kubelet/pods/89ac3c10-1912-4807-a62e-d91f5e5682b4/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.282542 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f832429-63c8-4af9-b0ed-26e3f989125c" path="/var/lib/kubelet/pods/9f832429-63c8-4af9-b0ed-26e3f989125c/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.283918 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"glance-default-external-api-0\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.313942 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f52d787e-af63-491a-a3f9-2a9626a9f8b8" path="/var/lib/kubelet/pods/f52d787e-af63-491a-a3f9-2a9626a9f8b8/volumes" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.328693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: 
\"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.337779 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.380383 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.383046 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.387868 4755 scope.go:117] "RemoveContainer" containerID="d5017bb26ea363ba6c34ab62bd590b712b5d345891a588320489ef26de7203fc" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.428058 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.443941 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.446362 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.450374 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.450692 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-4kqwk" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.451021 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.460404 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.476095 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506065 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506153 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506192 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506223 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506291 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506349 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506368 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.506392 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.512917 4755 scope.go:117] "RemoveContainer" containerID="64ab96d03ca143323b8e4e2f9a377635a8445ea040b43a99b898631ebccbd95c" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.550498 4755 scope.go:117] "RemoveContainer" containerID="0f1313c207ca5014457d471239ce544de0401fcb5a1364130f8e15a50182a141" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.572956 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610046 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610091 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610123 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610164 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " 
pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610220 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610257 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610302 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.610319 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: 
I0320 13:50:15.610341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.611920 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.611961 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.612485 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.612560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.613264 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.614624 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.622715 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.625554 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.633832 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"dnsmasq-dns-6b7b667979-4rsdm\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634836 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.634833 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"neutron-7cf99699dd-lg99t\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: W0320 13:50:15.651637 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74e82d0_07c7_4a72_baa4_9ec1e8427b5f.slice/crio-7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2 WatchSource:0}: Error finding container 7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2: Status 404 returned error can't find the container with id 7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2 Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.703947 4755 scope.go:117] "RemoveContainer" containerID="1b3d02ef6c9328638dff17201b5ab810e62505bd6c549c62927f2bbd73723e85" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.718637 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:50:15 crc kubenswrapper[4755]: E0320 13:50:15.745052 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/horizon-74d5b88dcf-ftnlg" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.745808 4755 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.796851 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.816394 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"5bdd7e139d44ce5556e1d3d02993b6d72f9014d19e2934c54cab87ef83fcc71a"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.819455 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerStarted","Data":"44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.819585 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-74d5b88dcf-ftnlg" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" containerID="cri-o://44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" gracePeriod=30 Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.831369 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"fd011c06f3fffccd2ebc454db1a10f42c4b31b9cc3cdee3a458a0730af40410b"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.856114 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.858565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerStarted","Data":"7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2"} Mar 20 13:50:15 crc kubenswrapper[4755]: I0320 13:50:15.972549 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.283319 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.456092 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.509028 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.645125 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.735866 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.890849 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerStarted","Data":"df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.891275 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerStarted","Data":"a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.909180 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"2a8c9089e1e7047efefde5df6ddda04cf4071fad9b8adaef75c6dc167f6e724d"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.918829 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rwsvb" podStartSLOduration=3.918647472 podStartE2EDuration="3.918647472s" podCreationTimestamp="2026-03-20 13:50:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:16.910909244 +0000 UTC m=+1196.508841773" watchObservedRunningTime="2026-03-20 13:50:16.918647472 +0000 UTC m=+1196.516580001" Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.976108 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"7c00d429c587fdfb6997848606a9ce03e91831e8523d4f6e66667f14705ac4e2"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.976159 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9f7d4c74d-t7tpq" event={"ID":"2af5836e-8c76-4432-95c0-ef34d6fc3528","Type":"ContainerStarted","Data":"2d33ff4239924776fca486654778ae26f786d68ec2a167824204dbc679fc4090"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.982235 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"c78877e8c0818f15b0f2e1b6adc8bc55f8f067e988d6b39df713c8a50ee71484"} Mar 20 13:50:16 crc kubenswrapper[4755]: I0320 13:50:16.992579 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} Mar 20 13:50:17 crc 
kubenswrapper[4755]: I0320 13:50:17.000559 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.003741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"e2bc4a76cdde92278100366b804ccffd9de5944afb2aabb9539badf236736c8a"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014421 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerStarted","Data":"4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014739 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fb9d46f97-rdkvb" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" containerID="cri-o://be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" gracePeriod=30 Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.014914 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-6fb9d46f97-rdkvb" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" containerID="cri-o://4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" gracePeriod=30 Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.015984 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9f7d4c74d-t7tpq" podStartSLOduration=23.015954331 podStartE2EDuration="23.015954331s" podCreationTimestamp="2026-03-20 13:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:17.005125651 +0000 UTC m=+1196.603058190" watchObservedRunningTime="2026-03-20 13:50:17.015954331 +0000 UTC m=+1196.613886860" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.041273 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerStarted","Data":"973fbf181b3cd53cfc9d88b983db4e84c26d3e70cf15d7f503f2e3da897707a2"} Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.054588 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54fd48b444-c4c9l" podStartSLOduration=23.054562286 podStartE2EDuration="23.054562286s" podCreationTimestamp="2026-03-20 13:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:17.038640475 +0000 UTC m=+1196.636573004" watchObservedRunningTime="2026-03-20 13:50:17.054562286 +0000 UTC m=+1196.652494825" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.117758 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6fb9d46f97-rdkvb" podStartSLOduration=4.446757397 podStartE2EDuration="28.117729604s" podCreationTimestamp="2026-03-20 13:49:49 +0000 UTC" firstStartedPulling="2026-03-20 13:49:50.170367898 +0000 UTC m=+1169.768300427" lastFinishedPulling="2026-03-20 13:50:13.841340105 +0000 UTC m=+1193.439272634" observedRunningTime="2026-03-20 13:50:17.069729166 +0000 UTC m=+1196.667661685" watchObservedRunningTime="2026-03-20 13:50:17.117729604 +0000 UTC m=+1196.715662143" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.979671 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.982092 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.986991 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 20 13:50:17 crc kubenswrapper[4755]: I0320 13:50:17.988111 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.011706 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089885 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.089999 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090022 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090166 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.090193 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.103576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerStarted","Data":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.121495 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" exitCode=0 Mar 20 13:50:18 crc kubenswrapper[4755]: 
I0320 13:50:18.122058 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.197596 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.203385 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.208766 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.208832 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209092 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.209326 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.215269 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.218885 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"neutron-68899c9585-6xzdq\" (UID: 
\"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.226302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.228273 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.244382 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.261065 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerStarted","Data":"5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.265310 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"neutron-68899c9585-6xzdq\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.293302 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.325075 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" podStartSLOduration=17.046940819 podStartE2EDuration="18.325049162s" podCreationTimestamp="2026-03-20 13:50:00 +0000 UTC" firstStartedPulling="2026-03-20 13:50:15.681373042 +0000 UTC m=+1195.279305571" lastFinishedPulling="2026-03-20 13:50:16.959481385 +0000 UTC m=+1196.557413914" observedRunningTime="2026-03-20 13:50:18.286824088 +0000 UTC m=+1197.884756607" watchObservedRunningTime="2026-03-20 13:50:18.325049162 +0000 UTC m=+1197.922981691" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.341986 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.342055 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerStarted","Data":"7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1"} Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.342387 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.360920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"} Mar 20 13:50:18 crc 
kubenswrapper[4755]: I0320 13:50:18.384580 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:18 crc kubenswrapper[4755]: I0320 13:50:18.390112 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7cf99699dd-lg99t" podStartSLOduration=3.390074389 podStartE2EDuration="3.390074389s" podCreationTimestamp="2026-03-20 13:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:18.380625285 +0000 UTC m=+1197.978557814" watchObservedRunningTime="2026-03-20 13:50:18.390074389 +0000 UTC m=+1197.988006918" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.288832 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.390088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"d570686fe60e350337cd58181076d1e8f618d5307ff29d77301f5c839ae0e2dc"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.411406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerStarted","Data":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.421445 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerStarted","Data":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.422385 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.424640 4755 generic.go:334] "Generic (PLEG): container finished" podID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerID="5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418" exitCode=0 Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.424700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerDied","Data":"5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.428325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerStarted","Data":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.450455 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" podStartSLOduration=4.450436849 podStartE2EDuration="4.450436849s" podCreationTimestamp="2026-03-20 13:50:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:19.4419312 +0000 UTC m=+1199.039863729" watchObservedRunningTime="2026-03-20 13:50:19.450436849 +0000 UTC m=+1199.048369378" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.464021 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:19 crc kubenswrapper[4755]: I0320 13:50:19.483360 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.483335147 podStartE2EDuration="5.483335147s" podCreationTimestamp="2026-03-20 13:50:14 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:19.466421092 +0000 UTC m=+1199.064353621" watchObservedRunningTime="2026-03-20 13:50:19.483335147 +0000 UTC m=+1199.081267676" Mar 20 13:50:20 crc kubenswrapper[4755]: I0320 13:50:20.441501 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e"} Mar 20 13:50:20 crc kubenswrapper[4755]: I0320 13:50:20.496778 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.496757119 podStartE2EDuration="6.496757119s" podCreationTimestamp="2026-03-20 13:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:20.4630513 +0000 UTC m=+1200.060983849" watchObservedRunningTime="2026-03-20 13:50:20.496757119 +0000 UTC m=+1200.094689648" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.035884 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.222260 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") pod \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\" (UID: \"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f\") " Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.240887 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds" (OuterVolumeSpecName: "kube-api-access-zrdds") pod "f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" (UID: "f74e82d0-07c7-4a72-baa4-9ec1e8427b5f"). InnerVolumeSpecName "kube-api-access-zrdds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.325319 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrdds\" (UniqueName: \"kubernetes.io/projected/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f-kube-api-access-zrdds\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.347209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.357335 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-2crfj"] Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.452929 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.453756 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-g8cp9" event={"ID":"f74e82d0-07c7-4a72-baa4-9ec1e8427b5f","Type":"ContainerDied","Data":"7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2"} Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.453799 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b4be9a25366dc820d0ac13c17501f6245fd5b800fa1435f2c0c2d4fd1dba0b2" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.458850 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerStarted","Data":"b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105"} Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.459344 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:21 crc kubenswrapper[4755]: I0320 13:50:21.483514 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-68899c9585-6xzdq" podStartSLOduration=4.483495271 podStartE2EDuration="4.483495271s" podCreationTimestamp="2026-03-20 13:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:21.479647362 +0000 UTC m=+1201.077579911" watchObservedRunningTime="2026-03-20 13:50:21.483495271 +0000 UTC m=+1201.081427820" Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.236401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="719824b6-7bd2-41dc-a61f-039b161a94d6" path="/var/lib/kubelet/pods/719824b6-7bd2-41dc-a61f-039b161a94d6/volumes" Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.497386 4755 
generic.go:334] "Generic (PLEG): container finished" podID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerID="df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9" exitCode=0 Mar 20 13:50:23 crc kubenswrapper[4755]: I0320 13:50:23.497452 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerDied","Data":"df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9"} Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.749846 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.750593 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.870823 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:24 crc kubenswrapper[4755]: I0320 13:50:24.870876 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.343055 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.343160 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.384378 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.392803 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.489376 
4755 scope.go:117] "RemoveContainer" containerID="9412ee211cf01afed52e63d1365ec0ed2b0d225ddc278755d3632e23fa6fff43" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.518030 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.518101 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.573919 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.574006 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.620691 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.629769 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.749633 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.824077 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:25 crc kubenswrapper[4755]: I0320 13:50:25.824276 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" containerID="cri-o://70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" gracePeriod=10 Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.300422 4755 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432757 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432824 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.432999 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.433204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.439668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.439872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") pod \"5dddb768-c318-44b8-bac9-ea26f29ca038\" (UID: \"5dddb768-c318-44b8-bac9-ea26f29ca038\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.443182 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8" (OuterVolumeSpecName: "kube-api-access-xssz8") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "kube-api-access-xssz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.445803 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xssz8\" (UniqueName: \"kubernetes.io/projected/5dddb768-c318-44b8-bac9-ea26f29ca038-kube-api-access-xssz8\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.447055 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.464867 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.486362 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts" (OuterVolumeSpecName: "scripts") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.490139 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data" (OuterVolumeSpecName: "config-data") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.506622 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dddb768-c318-44b8-bac9-ea26f29ca038" (UID: "5dddb768-c318-44b8-bac9-ea26f29ca038"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547074 4755 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547101 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547110 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547119 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.547128 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5dddb768-c318-44b8-bac9-ea26f29ca038-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rwsvb" event={"ID":"5dddb768-c318-44b8-bac9-ea26f29ca038","Type":"ContainerDied","Data":"a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838"} Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571217 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61f3e56c94a7d7b55d3ff51195c6f1eef99e6529c65900adf3d0d905f6a2838" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.571283 4755 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/keystone-bootstrap-rwsvb" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.587616 4755 generic.go:334] "Generic (PLEG): container finished" podID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerID="70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" exitCode=0 Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.588225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f"} Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.594201 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.594370 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.658143 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750431 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750545 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750584 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750886 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.750916 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7m57\" 
(UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") pod \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\" (UID: \"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e\") " Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.761862 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57" (OuterVolumeSpecName: "kube-api-access-p7m57") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "kube-api-access-p7m57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.853118 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7m57\" (UniqueName: \"kubernetes.io/projected/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-kube-api-access-p7m57\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:26 crc kubenswrapper[4755]: I0320 13:50:26.995704 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.003469 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.008893 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config" (OuterVolumeSpecName: "config") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.009769 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.012792 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" (UID: "d94e1cc6-3350-4f8d-a5ac-ed606250ef2e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060007 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060523 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060534 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060544 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.060554 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.455538 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456059 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="init" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456080 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="init" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456093 4755 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456100 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456125 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456135 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: E0320 13:50:27.456171 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456178 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456368 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" containerName="keystone-bootstrap" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456382 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.456395 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" containerName="oc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.457131 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463501 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463623 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.463913 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465503 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-hdrh5" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465516 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.465762 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467463 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467502 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467621 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467827 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.467866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.468040 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.468080 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.483824 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.570944 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571037 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571086 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571110 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " 
pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571172 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571193 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571222 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.571239 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.580494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-fernet-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.588665 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-scripts\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.588878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-credential-keys\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.589263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-combined-ca-bundle\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.591307 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-internal-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.595387 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-config-data\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.597529 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-public-tls-certs\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.599404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6wrx\" (UniqueName: \"kubernetes.io/projected/ab9d92e7-deba-4bdd-a267-e35fd5ec2f23-kube-api-access-h6wrx\") pod \"keystone-8f554bbf4-zvxzv\" (UID: \"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23\") " pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" event={"ID":"d94e1cc6-3350-4f8d-a5ac-ed606250ef2e","Type":"ContainerDied","Data":"f4cea975e04082628cd5787018d084f226541b1327d170cee0f4b957229de5d6"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611641 4755 scope.go:117] "RemoveContainer" containerID="70fb35799d248ec570e96b8604db17580ee93ff69528eae889eac6d41964292f" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.611861 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.618481 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.637882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerStarted","Data":"88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f"} Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.687463 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-cxr9p" podStartSLOduration=3.367121269 podStartE2EDuration="42.687436767s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.169834139 +0000 UTC m=+1166.767766658" lastFinishedPulling="2026-03-20 13:50:26.490149627 +0000 UTC m=+1206.088082156" observedRunningTime="2026-03-20 13:50:27.675727445 +0000 UTC m=+1207.273659994" watchObservedRunningTime="2026-03-20 13:50:27.687436767 +0000 UTC m=+1207.285369296" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.712274 4755 scope.go:117] "RemoveContainer" containerID="e30f36458d80c9adebf91aebb787303d5a4c021136e6f5c28c1778fb3d808295" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.753381 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.782398 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:27 crc kubenswrapper[4755]: I0320 13:50:27.802790 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-r7dnf"] Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.574976 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-8f554bbf4-zvxzv"] Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.657962 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8f554bbf4-zvxzv" event={"ID":"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23","Type":"ContainerStarted","Data":"585cb9060224708e15b0a8492d4c40abb252cdae21edc05d77427f0346747b94"} Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.658025 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:28 crc kubenswrapper[4755]: I0320 13:50:28.658051 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.211013 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.211474 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.249027 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" path="/var/lib/kubelet/pods/d94e1cc6-3350-4f8d-a5ac-ed606250ef2e/volumes" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.696273 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-8f554bbf4-zvxzv" event={"ID":"ab9d92e7-deba-4bdd-a267-e35fd5ec2f23","Type":"ContainerStarted","Data":"3f3d1cdfea6af9d8a908a412c496899ae82961d54640b17676dd0e9e650416a4"} Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.696771 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:50:29 crc kubenswrapper[4755]: I0320 13:50:29.728735 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-8f554bbf4-zvxzv" podStartSLOduration=2.728705751 podStartE2EDuration="2.728705751s" podCreationTimestamp="2026-03-20 13:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:29.717400259 +0000 UTC m=+1209.315332808" watchObservedRunningTime="2026-03-20 13:50:29.728705751 +0000 UTC m=+1209.326638280" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.108447 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.108537 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.710127 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.714794 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerDied","Data":"88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f"} Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.713635 4755 generic.go:334] "Generic (PLEG): container finished" podID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerID="88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f" exitCode=0 Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.724584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" 
event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerStarted","Data":"d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3"} Mar 20 13:50:30 crc kubenswrapper[4755]: I0320 13:50:30.824328 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-jrf8c" podStartSLOduration=4.016042495 podStartE2EDuration="45.824293549s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.355031793 +0000 UTC m=+1166.952964322" lastFinishedPulling="2026-03-20 13:50:29.163282847 +0000 UTC m=+1208.761215376" observedRunningTime="2026-03-20 13:50:30.806215164 +0000 UTC m=+1210.404147693" watchObservedRunningTime="2026-03-20 13:50:30.824293549 +0000 UTC m=+1210.422226078" Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.432741 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-56df8fb6b7-r7dnf" podUID="d94e1cc6-3350-4f8d-a5ac-ed606250ef2e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.145:5353: i/o timeout" Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.735210 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerStarted","Data":"d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597"} Mar 20 13:50:31 crc kubenswrapper[4755]: I0320 13:50:31.758118 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-dtggj" podStartSLOduration=3.490697084 podStartE2EDuration="46.758100808s" podCreationTimestamp="2026-03-20 13:49:45 +0000 UTC" firstStartedPulling="2026-03-20 13:49:47.441602564 +0000 UTC m=+1167.039535093" lastFinishedPulling="2026-03-20 13:50:30.709006288 +0000 UTC m=+1210.306938817" observedRunningTime="2026-03-20 13:50:31.754058994 +0000 UTC m=+1211.351991523" watchObservedRunningTime="2026-03-20 13:50:31.758100808 +0000 UTC 
m=+1211.356033337" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.218813 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333371 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333491 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333624 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.333789 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") pod \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\" (UID: \"7ea35a84-68ca-4490-b1d9-fa999ef63ebe\") " Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 
13:50:32.336085 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs" (OuterVolumeSpecName: "logs") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.354954 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr" (OuterVolumeSpecName: "kube-api-access-m97kr") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "kube-api-access-m97kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.355400 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts" (OuterVolumeSpecName: "scripts") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.364151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data" (OuterVolumeSpecName: "config-data") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.366484 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ea35a84-68ca-4490-b1d9-fa999ef63ebe" (UID: "7ea35a84-68ca-4490-b1d9-fa999ef63ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438582 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438624 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438634 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438643 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.438670 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m97kr\" (UniqueName: \"kubernetes.io/projected/7ea35a84-68ca-4490-b1d9-fa999ef63ebe-kube-api-access-m97kr\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752618 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-cxr9p" 
event={"ID":"7ea35a84-68ca-4490-b1d9-fa999ef63ebe","Type":"ContainerDied","Data":"7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662"} Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752693 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7119c0f11cad6efc530a4e14b913fb7cd88717b5f774e769ccfb380617e61662" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.752746 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-cxr9p" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.957186 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:32 crc kubenswrapper[4755]: E0320 13:50:32.957529 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.957544 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.962055 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" containerName="placement-db-sync" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.963877 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971457 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971715 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.971928 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.972905 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-ncc5q" Mar 20 13:50:32 crc kubenswrapper[4755]: I0320 13:50:32.973143 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.007326 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051358 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051410 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051497 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051528 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.051546 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154216 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154305 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154347 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154375 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154398 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.154445 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.156270 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0187d784-0bbe-4f5f-9b84-ee240bb90970-logs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.164102 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-combined-ca-bundle\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.164404 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-internal-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.168357 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-scripts\") pod \"placement-65884d74bb-n9mkw\" (UID: 
\"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.170347 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-config-data\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.175593 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0187d784-0bbe-4f5f-9b84-ee240bb90970-public-tls-certs\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.177676 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gjcb\" (UniqueName: \"kubernetes.io/projected/0187d784-0bbe-4f5f-9b84-ee240bb90970-kube-api-access-9gjcb\") pod \"placement-65884d74bb-n9mkw\" (UID: \"0187d784-0bbe-4f5f-9b84-ee240bb90970\") " pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.291202 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:33 crc kubenswrapper[4755]: I0320 13:50:33.810878 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-65884d74bb-n9mkw"] Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.751451 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.780299 4755 generic.go:334] "Generic (PLEG): container finished" podID="95c76f8c-7b76-4714-adac-6297b84d6492" containerID="d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597" exitCode=0 Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.780407 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerDied","Data":"d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597"} Mar 20 13:50:34 crc kubenswrapper[4755]: I0320 13:50:34.872345 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9f7d4c74d-t7tpq" podUID="2af5836e-8c76-4432-95c0-ef34d6fc3528" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 20 13:50:36 crc kubenswrapper[4755]: I0320 13:50:36.802019 4755 generic.go:334] "Generic (PLEG): container finished" podID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerID="d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3" exitCode=0 Mar 20 13:50:36 crc kubenswrapper[4755]: I0320 13:50:36.802093 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" 
event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerDied","Data":"d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.743261 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-dtggj" event={"ID":"95c76f8c-7b76-4714-adac-6297b84d6492","Type":"ContainerDied","Data":"39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855076 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b250f5e1e7bb646a21c55743a8b2114ac7daab0c0bbdbf6157f556d4805a70" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.855136 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-dtggj" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.859751 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"186b314629f910721f0b460f6b59d2c17b3ba67fcd83d6c40f5827c4304495b1"} Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926374 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") pod \"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") pod 
\"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.926462 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") pod \"95c76f8c-7b76-4714-adac-6297b84d6492\" (UID: \"95c76f8c-7b76-4714-adac-6297b84d6492\") " Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.935742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7" (OuterVolumeSpecName: "kube-api-access-8zjs7") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "kube-api-access-8zjs7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.971159 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:37 crc kubenswrapper[4755]: I0320 13:50:37.975488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c76f8c-7b76-4714-adac-6297b84d6492" (UID: "95c76f8c-7b76-4714-adac-6297b84d6492"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030246 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zjs7\" (UniqueName: \"kubernetes.io/projected/95c76f8c-7b76-4714-adac-6297b84d6492-kube-api-access-8zjs7\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030279 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.030290 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/95c76f8c-7b76-4714-adac-6297b84d6492-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.876727 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-jrf8c" event={"ID":"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0","Type":"ContainerDied","Data":"88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9"} Mar 20 13:50:38 crc kubenswrapper[4755]: I0320 13:50:38.877211 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.074954 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.132730 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.133412 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.133559 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133634 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.133964 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" containerName="cinder-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.134069 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c76f8c-7b76-4714-adac-6297b84d6492" containerName="barbican-db-sync" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.135149 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.144103 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.149180 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.170556 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nvndn" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.186696 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.188867 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189716 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189851 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" 
(UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.189942 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.190060 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.193522 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.200318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") pod \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\" (UID: \"25bd1da4-7fdb-4bd9-8405-a37fc6c18be0\") " Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.230590 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg" (OuterVolumeSpecName: "kube-api-access-fl4vg") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "kube-api-access-fl4vg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.230720 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231681 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231795 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.231980 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232020 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq97w\" (UniqueName: 
\"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232140 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232286 4755 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232301 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl4vg\" (UniqueName: \"kubernetes.io/projected/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-kube-api-access-fl4vg\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.232312 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.257402 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts" (OuterVolumeSpecName: "scripts") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.282484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.388552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.436193 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data" (OuterVolumeSpecName: "config-data") pod "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" (UID: "25bd1da4-7fdb-4bd9-8405-a37fc6c18be0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480105 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480258 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480332 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480470 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480505 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq97w\" (UniqueName: \"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " 
pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480611 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480626 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.480640 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.489067 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2108220-35b4-45b7-a2bc-e93138394ff0-logs\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.502968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-combined-ca-bundle\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.510879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc 
kubenswrapper[4755]: I0320 13:50:39.514000 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.514035 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.515538 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d2108220-35b4-45b7-a2bc-e93138394ff0-config-data-custom\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.537820 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.540471 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.548752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.571057 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq97w\" (UniqueName: \"kubernetes.io/projected/d2108220-35b4-45b7-a2bc-e93138394ff0-kube-api-access-cq97w\") pod \"barbican-worker-56b9dc5449-j62ns\" (UID: \"d2108220-35b4-45b7-a2bc-e93138394ff0\") " pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585550 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: 
\"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585616 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585647 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585685 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585707 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585730 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod 
\"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585752 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585773 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585824 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.585877 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.625433 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.627472 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.630323 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.637063 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688312 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688633 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688710 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688756 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688796 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688829 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688860 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688893 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688922 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.688980 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.690940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/55a78d73-f853-49d7-99b2-81c25ea6bb20-logs\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.691962 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.692302 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.692947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.693104 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data-custom\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.693327 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.695741 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-config-data\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.696024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.698305 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55a78d73-f853-49d7-99b2-81c25ea6bb20-combined-ca-bundle\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.715171 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96rzz\" (UniqueName: \"kubernetes.io/projected/55a78d73-f853-49d7-99b2-81c25ea6bb20-kube-api-access-96rzz\") pod \"barbican-keystone-listener-cbc45f8f6-z2sx8\" (UID: \"55a78d73-f853-49d7-99b2-81c25ea6bb20\") " pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.717018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"dnsmasq-dns-848cf88cfc-2kstk\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") " pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: E0320 13:50:39.726245 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.772420 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-56b9dc5449-j62ns" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791411 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791519 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " 
pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.791614 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.881929 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.891338 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerStarted","Data":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.891627 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" containerID="cri-o://924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892456 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892868 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" containerID="cri-o://a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.892974 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core" containerID="cri-o://6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" gracePeriod=30 Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894231 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894284 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894366 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.894405 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod 
\"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.896587 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.906684 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.907464 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.918031 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-jrf8c" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.918091 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"334c75e201ebd221a910b6f0d02653ea2f7672aff2f9ca41d27e206c9b0110f1"} Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.919821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.946583 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"barbican-api-7f8cb8c64b-l8cp4\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") " pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.947339 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:39 crc kubenswrapper[4755]: I0320 13:50:39.963605 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.390134 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.454953 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-56b9dc5449-j62ns"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.489224 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.562436 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.564581 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587117 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587434 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587551 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-t52g6" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.587784 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.610531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.647040 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.712865 4755 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.714942 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.736713 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738265 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738307 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 
13:50:40.738333 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.738362 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.754127 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-cbc45f8f6-z2sx8"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839915 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839942 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.839995 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840021 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840051 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840075 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840111 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " 
pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840148 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840188 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840224 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.840246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.845280 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc 
kubenswrapper[4755]: I0320 13:50:40.849857 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.851997 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.875438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.877730 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.878360 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.882541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.882949 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.885897 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"cinder-scheduler-0\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.886590 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.954009 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.955727 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.957159 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.985974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"076583b3b51e5d40037512ddc7d4c0826835eb73b1677d159b7eca71658140d7"} Mar 20 13:50:40 crc kubenswrapper[4755]: I0320 13:50:40.998359 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-65884d74bb-n9mkw" event={"ID":"0187d784-0bbe-4f5f-9b84-ee240bb90970","Type":"ContainerStarted","Data":"2e8685ea7959701a68d3522cc3c0f0316197ee7a0d886022a43a11423a58119b"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.000542 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001120 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001537 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001582 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-65884d74bb-n9mkw" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.001839 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002028 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.002891 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.031062 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.037922 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod 
\"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.038202 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-65884d74bb-n9mkw" podStartSLOduration=9.038186211 podStartE2EDuration="9.038186211s" podCreationTimestamp="2026-03-20 13:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:41.03232347 +0000 UTC m=+1220.630255999" watchObservedRunningTime="2026-03-20 13:50:41.038186211 +0000 UTC m=+1220.636118740" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.038856 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.042461 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.042857 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"dnsmasq-dns-6578955fd5-656mk\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.050732 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerStarted","Data":"c99441aa9ab711c343a4fada1ebb8fbe5bc8444dea8e49f8792ccb4797464786"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.066904 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"3eb9eb785e7b462445fda35b80ffeb9c678e258078c5e90a1b14a0e2b7293256"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.078396 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"95166cf5a1430ac49b682e214261fcd32c3c05c72a9b931d676bf64e207ffdda"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100242 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" exitCode=0 Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100299 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" exitCode=2 Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100335 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.100373 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 
13:50:41.109160 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109567 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109647 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109683 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109758 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109832 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc 
kubenswrapper[4755]: I0320 13:50:41.109905 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.109962 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.212836 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213161 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.213247 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.214492 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.215329 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.220088 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.225507 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.226215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.245482 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.266877 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"cinder-api-0\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.422292 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.552825 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.637961 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640767 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640872 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.640964 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641028 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxd7x\" (UniqueName: 
\"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.641070 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") pod \"ded8942b-87a3-49fa-80fb-dc830c09f18d\" (UID: \"ded8942b-87a3-49fa-80fb-dc830c09f18d\") " Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.649067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.649158 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.664130 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x" (OuterVolumeSpecName: "kube-api-access-zxd7x") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "kube-api-access-zxd7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.691191 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts" (OuterVolumeSpecName: "scripts") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755628 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755687 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755705 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ded8942b-87a3-49fa-80fb-dc830c09f18d-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.755717 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxd7x\" (UniqueName: \"kubernetes.io/projected/ded8942b-87a3-49fa-80fb-dc830c09f18d-kube-api-access-zxd7x\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.791616 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.899812 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.970270 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.989774 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:41 crc kubenswrapper[4755]: I0320 13:50:41.997184 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.023812 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data" (OuterVolumeSpecName: "config-data") pod "ded8942b-87a3-49fa-80fb-dc830c09f18d" (UID: "ded8942b-87a3-49fa-80fb-dc830c09f18d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.094661 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.094706 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ded8942b-87a3-49fa-80fb-dc830c09f18d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.112042 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.117866 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerStarted","Data":"0ac451aa4ed8d677d77946a6a4c4490aa16c5aad1720a8d22a9ecfc0acddbe6e"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.120879 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.121023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerStarted","Data":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127083 4755 generic.go:334] "Generic (PLEG): container finished" podID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" exitCode=0 Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127179 
4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127219 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ded8942b-87a3-49fa-80fb-dc830c09f18d","Type":"ContainerDied","Data":"aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127242 4755 scope.go:117] "RemoveContainer" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.127434 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.134103 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"9bab4e710eb88089079a33e7c9e02e8a26667fa5d5cf668bd290bf2ec7796c32"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.144968 4755 generic.go:334] "Generic (PLEG): container finished" podID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerID="26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b" exitCode=0 Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.145974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerDied","Data":"26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b"} Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.157867 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podStartSLOduration=3.157839921 
podStartE2EDuration="3.157839921s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:42.146199821 +0000 UTC m=+1221.744132350" watchObservedRunningTime="2026-03-20 13:50:42.157839921 +0000 UTC m=+1221.755772450" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.180255 4755 scope.go:117] "RemoveContainer" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.294138 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.308927 4755 scope.go:117] "RemoveContainer" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.315837 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.326725 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.327466 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327494 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd" Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.327534 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent" Mar 20 13:50:42 crc 
kubenswrapper[4755]: E0320 13:50:42.327576 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327585 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327878 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="proxy-httpd"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327896 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="sg-core"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.327920 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" containerName="ceilometer-notification-agent"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.330356 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.332592 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.333470 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.347926 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.398882 4755 scope.go:117] "RemoveContainer" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"
Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.399439 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": container with ID starting with a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9 not found: ID does not exist" containerID="a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.399477 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9"} err="failed to get container status \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": rpc error: code = NotFound desc = could not find container \"a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9\": container with ID starting with a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9 not found: ID does not exist"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.399507 4755 scope.go:117] "RemoveContainer" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"
Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.399983 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": container with ID starting with 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 not found: ID does not exist" containerID="6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400068 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3"} err="failed to get container status \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": rpc error: code = NotFound desc = could not find container \"6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3\": container with ID starting with 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 not found: ID does not exist"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400112 4755 scope.go:117] "RemoveContainer" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"
Mar 20 13:50:42 crc kubenswrapper[4755]: E0320 13:50:42.400696 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": container with ID starting with 924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f not found: ID does not exist" containerID="924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.400729 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f"} err="failed to get container status \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": rpc error: code = NotFound desc = could not find container \"924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f\": container with ID starting with 924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f not found: ID does not exist"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410416 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410469 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410512 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410545 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410695 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410734 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.410777 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.512620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513555 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513661 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513702 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.513749 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.514518 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.515299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.519622 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.521569 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.533348 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.533821 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.543432 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"ceilometer-0\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") " pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.643140 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.660059 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.728605 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729147 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729296 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729552 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729707 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.729880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") pod \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\" (UID: \"f9333e7f-e263-450e-8a0e-0e788a57fd6d\") "
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.734221 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg" (OuterVolumeSpecName: "kube-api-access-s2mdg") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "kube-api-access-s2mdg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.759678 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.764374 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.766607 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.792801 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config" (OuterVolumeSpecName: "config") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833595 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833638 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833673 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2mdg\" (UniqueName: \"kubernetes.io/projected/f9333e7f-e263-450e-8a0e-0e788a57fd6d-kube-api-access-s2mdg\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833684 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.833693 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.867734 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9333e7f-e263-450e-8a0e-0e788a57fd6d" (UID: "f9333e7f-e263-450e-8a0e-0e788a57fd6d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:50:42 crc kubenswrapper[4755]: I0320 13:50:42.935711 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9333e7f-e263-450e-8a0e-0e788a57fd6d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165694 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk" event={"ID":"f9333e7f-e263-450e-8a0e-0e788a57fd6d","Type":"ContainerDied","Data":"c99441aa9ab711c343a4fada1ebb8fbe5bc8444dea8e49f8792ccb4797464786"}
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165752 4755 scope.go:117] "RemoveContainer" containerID="26b3d21f098f4badc81156151801e1a17d3f572d355233ff058eec05b7d0ce8b"
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.165952 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-2kstk"
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.177027 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerID="74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b" exitCode=0
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.177201 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b"}
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.186556 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"a07bbef33f9bf32ac084e64d0d3ad49faa7ce5c4a2cf17fd44091e1ae8ffdb19"}
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.192842 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8cb8c64b-l8cp4"
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.192883 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7f8cb8c64b-l8cp4"
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.278999 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ded8942b-87a3-49fa-80fb-dc830c09f18d" path="/var/lib/kubelet/pods/ded8942b-87a3-49fa-80fb-dc830c09f18d/volumes"
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.281103 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"]
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.289212 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-2kstk"]
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.364896 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:50:43 crc kubenswrapper[4755]: I0320 13:50:43.595672 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 13:50:44 crc kubenswrapper[4755]: I0320 13:50:44.214705 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948"}
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.279068 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" path="/var/lib/kubelet/pods/f9333e7f-e263-450e-8a0e-0e788a57fd6d/volumes"
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350673 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-656mk"
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"63a07ea9507732987fa76339a9da53fdf9074739ca064b32427bb55875628827"}
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerStarted","Data":"fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437"}
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.350774 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"5f182f3057b3a41f85d8264583c773d7a77cb3fa05eac8eaf7a619fde59214e4"}
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.352025 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"fe0020f726940f7dec3507ad4c9a928afdb07065e96f86a88720feb42e86a8e0"}
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.447699 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-656mk" podStartSLOduration=5.447683306 podStartE2EDuration="5.447683306s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:45.417961 +0000 UTC m=+1225.015893539" watchObservedRunningTime="2026-03-20 13:50:45.447683306 +0000 UTC m=+1225.045615835"
Mar 20 13:50:45 crc kubenswrapper[4755]: I0320 13:50:45.828766 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7cf99699dd-lg99t"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.201451 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"]
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.202173 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" containerID="cri-o://2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" gracePeriod=30
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.202754 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" containerID="cri-o://b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" gracePeriod=30
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243200 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"]
Mar 20 13:50:46 crc kubenswrapper[4755]: E0320 13:50:46.243682 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243696 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.243909 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9333e7f-e263-450e-8a0e-0e788a57fd6d" containerName="init"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.253680 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.264853 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.265129 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.279075 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"]
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.289368 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": read tcp 10.217.0.2:40692->10.217.0.159:9696: read: connection reset by peer"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.294985 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-754b98cbff-jgntp"]
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.297392 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.321981 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754b98cbff-jgntp"]
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401360 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401500 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401561 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401582 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401667 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401713 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401759 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.401970 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402100 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402136 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402160 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402257 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402315 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.402359 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424119 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerStarted","Data":"7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3"}
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424312 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" containerID="cri-o://57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" gracePeriod=30
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.424925 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" containerID="cri-o://7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" gracePeriod=30
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.462244 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.462206085 podStartE2EDuration="6.462206085s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:46.449241091 +0000 UTC m=+1226.047173610" watchObservedRunningTime="2026-03-20 13:50:46.462206085 +0000 UTC m=+1226.060138614"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.472934 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" event={"ID":"55a78d73-f853-49d7-99b2-81c25ea6bb20","Type":"ContainerStarted","Data":"c18f9b9b717a7727cb85e80d5965524cb055402bc4a7cd8d667cf040706879ee"}
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.502253 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-56b9dc5449-j62ns" event={"ID":"d2108220-35b4-45b7-a2bc-e93138394ff0","Type":"ContainerStarted","Data":"6f77d1934c6dad8258402daa2e37c2a1f5cc3ef9b1ec6c01ec3c5558baf3c451"}
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504198 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504324 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504341 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504388 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504417 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504439 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp"
Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504544 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504570 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.504620 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.519767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.534341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.553315 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-public-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.553950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-public-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.554211 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45fa2a85-b7d9-413c-827c-fdcbcec05faf-logs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.555845 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-cbc45f8f6-z2sx8" podStartSLOduration=3.7609589359999998 podStartE2EDuration="7.555818668s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 
13:50:40.784817431 +0000 UTC m=+1220.382749950" lastFinishedPulling="2026-03-20 13:50:44.579677163 +0000 UTC m=+1224.177609682" observedRunningTime="2026-03-20 13:50:46.55436726 +0000 UTC m=+1226.152299789" watchObservedRunningTime="2026-03-20 13:50:46.555818668 +0000 UTC m=+1226.153751197" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.557086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-internal-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.562767 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq8n4\" (UniqueName: \"kubernetes.io/projected/0263cee7-e9d5-48ff-8326-7455a95311a6-kube-api-access-dq8n4\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.564124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-internal-tls-certs\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.564323 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.578643 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-ovndb-tls-certs\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.579565 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-combined-ca-bundle\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.580151 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/0263cee7-e9d5-48ff-8326-7455a95311a6-httpd-config\") pod \"neutron-754b98cbff-jgntp\" (UID: \"0263cee7-e9d5-48ff-8326-7455a95311a6\") " pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.586144 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-config-data-custom\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.603260 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45fa2a85-b7d9-413c-827c-fdcbcec05faf-combined-ca-bundle\") pod \"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.641855 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbvbk\" (UniqueName: \"kubernetes.io/projected/45fa2a85-b7d9-413c-827c-fdcbcec05faf-kube-api-access-qbvbk\") pod 
\"barbican-api-7769db74db-f4kfh\" (UID: \"45fa2a85-b7d9-413c-827c-fdcbcec05faf\") " pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.658895 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.679542 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.690728 4755 generic.go:334] "Generic (PLEG): container finished" podID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerID="44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" exitCode=137 Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.691398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerDied","Data":"44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2"} Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.884214 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.897098 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929291 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929415 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929443 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929560 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.929690 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") pod \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\" (UID: \"2f75cbbe-c852-4090-aca4-42cd87a3a9b3\") " Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.930469 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs" (OuterVolumeSpecName: "logs") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.938388 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n" (OuterVolumeSpecName: "kube-api-access-mv45n") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "kube-api-access-mv45n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.961906 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:46 crc kubenswrapper[4755]: I0320 13:50:46.963953 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-56b9dc5449-j62ns" podStartSLOduration=3.871998037 podStartE2EDuration="7.963927216s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 13:50:40.488938185 +0000 UTC m=+1220.086870714" lastFinishedPulling="2026-03-20 13:50:44.580867364 +0000 UTC m=+1224.178799893" observedRunningTime="2026-03-20 13:50:46.604231256 +0000 UTC m=+1226.202163775" watchObservedRunningTime="2026-03-20 13:50:46.963927216 +0000 UTC m=+1226.561859745" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034302 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034733 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.034743 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mv45n\" (UniqueName: \"kubernetes.io/projected/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-kube-api-access-mv45n\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.068444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts" (OuterVolumeSpecName: "scripts") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.110406 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data" (OuterVolumeSpecName: "config-data") pod "2f75cbbe-c852-4090-aca4-42cd87a3a9b3" (UID: "2f75cbbe-c852-4090-aca4-42cd87a3a9b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.137869 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.138262 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2f75cbbe-c852-4090-aca4-42cd87a3a9b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154086 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-conmon-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-conmon-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154136 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154158 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-conmon-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-conmon-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.154175 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c76f8c_7b76_4714_adac_6297b84d6492.slice/crio-d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.155148 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.160027 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-a685bf9b5b3783737682242dee08d77aa716471d14cd21f056944a975a3c87a9.scope: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.167827 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3.scope WatchSource:0}: Error finding container 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3: Status 404 returned error can't find the container with id 6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3 Mar 20 13:50:47 crc kubenswrapper[4755]: W0320 13:50:47.201229 4755 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9333e7f_e263_450e_8a0e_0e788a57fd6d.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9333e7f_e263_450e_8a0e_0e788a57fd6d.slice: no such file or directory Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718428 4755 generic.go:334] "Generic (PLEG): container finished" podID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" 
containerID="4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" exitCode=137 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718898 4755 generic.go:334] "Generic (PLEG): container finished" podID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerID="be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" exitCode=137 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.718972 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.720924 4755 generic.go:334] "Generic (PLEG): container finished" podID="58518373-7b53-4ecc-bc83-3982b7688219" containerID="7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" exitCode=0 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.720967 4755 generic.go:334] "Generic (PLEG): container finished" podID="58518373-7b53-4ecc-bc83-3982b7688219" containerID="57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" exitCode=143 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.721021 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.721056 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.722603 4755 generic.go:334] "Generic (PLEG): container finished" podID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerID="b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" exitCode=0 Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.722675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.727827 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.748028 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerStarted","Data":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.764005 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.789722 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-74d5b88dcf-ftnlg" event={"ID":"2f75cbbe-c852-4090-aca4-42cd87a3a9b3","Type":"ContainerDied","Data":"935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb"} Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.789791 4755 scope.go:117] "RemoveContainer" containerID="44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2" Mar 20 13:50:47 
crc kubenswrapper[4755]: I0320 13:50:47.789983 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-74d5b88dcf-ftnlg" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.816393 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.199341486 podStartE2EDuration="7.816353268s" podCreationTimestamp="2026-03-20 13:50:40 +0000 UTC" firstStartedPulling="2026-03-20 13:50:41.965567815 +0000 UTC m=+1221.563500344" lastFinishedPulling="2026-03-20 13:50:44.582579597 +0000 UTC m=+1224.180512126" observedRunningTime="2026-03-20 13:50:47.76909018 +0000 UTC m=+1227.367022709" watchObservedRunningTime="2026-03-20 13:50:47.816353268 +0000 UTC m=+1227.414285787" Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.904718 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:50:47 crc kubenswrapper[4755]: I0320 13:50:47.912408 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-74d5b88dcf-ftnlg"] Mar 20 13:50:48 crc kubenswrapper[4755]: E0320 13:50:48.138190 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-aa085dd33c3482191b77bb52afe821f7a2bdb127a838b142a6cc26125ebe5f9e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-6d6a55ac85711c6043e926e1bdecf78feab6ef2010b3453c18c706ab82eb39d3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f.scope\": RecentStats: unable to find data in memory 
cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-935676461a0f68fb05e2cbff2d17aad3ec596d2702ca73c0eb704a8f7b9a97bb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice/crio-conmon-924a39603d89f498b39d2c57aada2b676d98736ae4e2b19141d4ea632a9ffe8f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-conmon-4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice/crio-88b84b3180a073841694c8b0a9f0f0c3cd93801c9cf952e6bff7218c39cb9cd9\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-conmon-44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba56df0_ceeb_40c0_b1b0_15bb4d548b80.slice/crio-4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f75cbbe_c852_4090_aca4_42cd87a3a9b3.slice/crio-44482e282fc071fab25875e63fa2afb8682b4c3355ce3a4bcb9cd7632d4d8ef2.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25bd1da4_7fdb_4bd9_8405_a37fc6c18be0.slice\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podded8942b_87a3_49fa_80fb_dc830c09f18d.slice\": RecentStats: unable to find data in memory cache]" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.144938 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7769db74db-f4kfh"] Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.163606 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.269430 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-754b98cbff-jgntp"] Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.364894 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.389929 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-68899c9585-6xzdq" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.159:9696/\": dial tcp 10.217.0.159:9696: connect: connection refused" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.489889 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.489966 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc 
kubenswrapper[4755]: I0320 13:50:48.490085 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.490181 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.490262 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") pod \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\" (UID: \"cba56df0-ceeb-40c0-b1b0-15bb4d548b80\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.497218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs" (OuterVolumeSpecName: "logs") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.497710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.503134 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh" (OuterVolumeSpecName: "kube-api-access-rn2fh") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "kube-api-access-rn2fh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.517314 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts" (OuterVolumeSpecName: "scripts") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.518452 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data" (OuterVolumeSpecName: "config-data") pod "cba56df0-ceeb-40c0-b1b0-15bb4d548b80" (UID: "cba56df0-ceeb-40c0-b1b0-15bb4d548b80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.534530 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594145 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594861 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn2fh\" (UniqueName: \"kubernetes.io/projected/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-kube-api-access-rn2fh\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.594980 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.595047 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.595100 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cba56df0-ceeb-40c0-b1b0-15bb4d548b80-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707124 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707321 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.707381 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711455 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711618 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.711801 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") pod \"58518373-7b53-4ecc-bc83-3982b7688219\" (UID: \"58518373-7b53-4ecc-bc83-3982b7688219\") " Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.712232 4755 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs" (OuterVolumeSpecName: "logs") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.712802 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.713012 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/58518373-7b53-4ecc-bc83-3982b7688219-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.718880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.723675 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts" (OuterVolumeSpecName: "scripts") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.730945 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd" (OuterVolumeSpecName: "kube-api-access-rdsdd") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "kube-api-access-rdsdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.753525 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815667 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/58518373-7b53-4ecc-bc83-3982b7688219-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815701 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdsdd\" (UniqueName: \"kubernetes.io/projected/58518373-7b53-4ecc-bc83-3982b7688219-kube-api-access-rdsdd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815713 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815722 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.815732 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.824710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data" (OuterVolumeSpecName: "config-data") pod "58518373-7b53-4ecc-bc83-3982b7688219" (UID: "58518373-7b53-4ecc-bc83-3982b7688219"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828710 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6fb9d46f97-rdkvb" event={"ID":"cba56df0-ceeb-40c0-b1b0-15bb4d548b80","Type":"ContainerDied","Data":"c48465acdb60bb091f412e1013ff879827172972e8accd680b44ab98c9925827"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828781 4755 scope.go:117] "RemoveContainer" containerID="4f045f7f54b4b1a7ab389541498fff14b458defc4a7ed0a92f52bfc8e38f3a08" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.828903 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6fb9d46f97-rdkvb" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.836273 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.837105 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"58518373-7b53-4ecc-bc83-3982b7688219","Type":"ContainerDied","Data":"a07bbef33f9bf32ac084e64d0d3ad49faa7ce5c4a2cf17fd44091e1ae8ffdb19"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.848453 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"510c291a7c70e7aad56901b2d2d28fa193f11c28bd50136a3b75868deba06c96"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.848504 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"f50d00ada2da3bbe1664473ead894d1b915667651d38013543c1f0dedb6ccc75"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.862832 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"bf095427e3eb80033429a5e7a833eaecddcfcfa1eb35f4693e34708cbf4e8b51"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.862896 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"ebfb7d279d9d32922e2526035dc4a22dadb837238236158cb6c10a364f8c1547"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.867049 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006"} Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.930919 4755 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58518373-7b53-4ecc-bc83-3982b7688219-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:48 crc kubenswrapper[4755]: I0320 13:50:48.984029 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.016961 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.052962 4755 scope.go:117] "RemoveContainer" containerID="be9f26e3425fa0966666fff2d3b262091f7a9cc1352bebc4f623e8d2b43784ed" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.072361 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074001 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074020 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074030 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074036 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074084 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074094 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 
13:50:49.074113 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074474 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: E0320 13:50:49.074511 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074520 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074788 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074804 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074843 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074853 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" containerName="horizon" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.074865 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="58518373-7b53-4ecc-bc83-3982b7688219" containerName="cinder-api-log" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.076378 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082006 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082356 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.082562 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.128785 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.166734 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.175779 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6fb9d46f97-rdkvb"] Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.208560 4755 scope.go:117] "RemoveContainer" containerID="7c1d553e5ebb1f501069816a6eb05ad12ec56c676e250d918a2336d6430b65c3" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240035 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 
13:50:49.240168 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240203 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240235 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240268 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240359 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.240454 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.264905 4755 scope.go:117] "RemoveContainer" containerID="57526a2c0076d9b294d728b9745d8da4e417f75ccaff33280cbac60be779a948" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.319978 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f75cbbe-c852-4090-aca4-42cd87a3a9b3" path="/var/lib/kubelet/pods/2f75cbbe-c852-4090-aca4-42cd87a3a9b3/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.320723 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58518373-7b53-4ecc-bc83-3982b7688219" path="/var/lib/kubelet/pods/58518373-7b53-4ecc-bc83-3982b7688219/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.321850 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba56df0-ceeb-40c0-b1b0-15bb4d548b80" path="/var/lib/kubelet/pods/cba56df0-ceeb-40c0-b1b0-15bb4d548b80/volumes" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345367 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345466 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345516 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345552 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345578 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345617 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " 
pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345679 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.345715 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.348024 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9894c7cb-7899-4354-a6c2-e7339eb1f765-etc-machine-id\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.351421 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9894c7cb-7899-4354-a6c2-e7339eb1f765-logs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.353541 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data-custom\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.357054 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-scripts\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.360738 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-config-data\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.367205 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.373188 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp97l\" (UniqueName: \"kubernetes.io/projected/9894c7cb-7899-4354-a6c2-e7339eb1f765-kube-api-access-kp97l\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.378199 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-public-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.385674 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9894c7cb-7899-4354-a6c2-e7339eb1f765-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"9894c7cb-7899-4354-a6c2-e7339eb1f765\") " pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.477225 4755 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.897271 4755 generic.go:334] "Generic (PLEG): container finished" podID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerID="2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" exitCode=0 Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.897757 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.907786 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-754b98cbff-jgntp" event={"ID":"0263cee7-e9d5-48ff-8326-7455a95311a6","Type":"ContainerStarted","Data":"3c5180748278247e8682e0b8caf44998c1cfcc0ac7e8999b2bea12bcc98ec95c"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.909602 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-754b98cbff-jgntp" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.944866 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7769db74db-f4kfh" event={"ID":"45fa2a85-b7d9-413c-827c-fdcbcec05faf","Type":"ContainerStarted","Data":"57a1b5bf9d00fce8765d2aec48b5c36112e1e206c4ea6824c6387bd16ca8cafd"} Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.946540 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.946583 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:49 crc kubenswrapper[4755]: I0320 13:50:49.949200 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-754b98cbff-jgntp" 
podStartSLOduration=3.949183832 podStartE2EDuration="3.949183832s" podCreationTimestamp="2026-03-20 13:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:49.937251474 +0000 UTC m=+1229.535184013" watchObservedRunningTime="2026-03-20 13:50:49.949183832 +0000 UTC m=+1229.547116361" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.056527 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7769db74db-f4kfh" podStartSLOduration=4.056509187 podStartE2EDuration="4.056509187s" podCreationTimestamp="2026-03-20 13:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:49.969031093 +0000 UTC m=+1229.566963622" watchObservedRunningTime="2026-03-20 13:50:50.056509187 +0000 UTC m=+1229.654441716" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.059153 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.221885 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.378142 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472585 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472867 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.472982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473017 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473040 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473128 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.473187 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") pod \"a4c0d88b-a127-41a4-824c-e09a285a5a62\" (UID: \"a4c0d88b-a127-41a4-824c-e09a285a5a62\") " Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.490356 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67" (OuterVolumeSpecName: "kube-api-access-zwv67") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "kube-api-access-zwv67". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.505874 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.539959 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576639 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576694 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.576704 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwv67\" (UniqueName: \"kubernetes.io/projected/a4c0d88b-a127-41a4-824c-e09a285a5a62-kube-api-access-zwv67\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.616444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.619772 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.643764 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config" (OuterVolumeSpecName: "config") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.664758 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a4c0d88b-a127-41a4-824c-e09a285a5a62" (UID: "a4c0d88b-a127-41a4-824c-e09a285a5a62"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.672480 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9f7d4c74d-t7tpq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680098 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680156 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680172 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.680187 4755 reconciler_common.go:293] "Volume detached for volume 
\"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4c0d88b-a127-41a4-824c-e09a285a5a62-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.768554 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957079 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-68899c9585-6xzdq" event={"ID":"a4c0d88b-a127-41a4-824c-e09a285a5a62","Type":"ContainerDied","Data":"d570686fe60e350337cd58181076d1e8f618d5307ff29d77301f5c839ae0e2dc"} Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957140 4755 scope.go:117] "RemoveContainer" containerID="b5a0ea0e8d30cc9dca1e1e5f42d6944615a0eef7af5ab5aeda5e4fe6dc137105" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.957149 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-68899c9585-6xzdq" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.959375 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.961406 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" containerID="cri-o://50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" gracePeriod=30 Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.961626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"39efe372539eba9227bd4e28c3450c18b0f33b1a15d13f52fba8d4ef0c2c0e9c"} Mar 20 13:50:50 crc kubenswrapper[4755]: I0320 13:50:50.962634 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54fd48b444-c4c9l" 
podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" containerID="cri-o://defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" gracePeriod=30 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.029745 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.038104 4755 scope.go:117] "RemoveContainer" containerID="2e2b3273e77fab2188612b2337af02ce0c6d995c9efe8ca56e544749db00262e" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.050689 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-68899c9585-6xzdq"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.111517 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.198464 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.198735 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" containerID="cri-o://54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" gracePeriod=10 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.289914 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" path="/var/lib/kubelet/pods/a4c0d88b-a127-41a4-824c-e09a285a5a62/volumes" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.290634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.486456 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:51 
crc kubenswrapper[4755]: I0320 13:50:51.671391 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709679 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709745 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709770 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709842 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709942 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.709998 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") pod \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\" (UID: \"ee532ae9-f63a-4f8c-82db-3d81014a6e05\") " Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.717431 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr" (OuterVolumeSpecName: "kube-api-access-kq9xr") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "kube-api-access-kq9xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.761489 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.764955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.770129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.796315 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config" (OuterVolumeSpecName: "config") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812624 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812689 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812703 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.812715 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kq9xr\" (UniqueName: \"kubernetes.io/projected/ee532ae9-f63a-4f8c-82db-3d81014a6e05-kube-api-access-kq9xr\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.816444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.823061 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ee532ae9-f63a-4f8c-82db-3d81014a6e05" (UID: "ee532ae9-f63a-4f8c-82db-3d81014a6e05"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.913889 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.913916 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ee532ae9-f63a-4f8c-82db-3d81014a6e05-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.979565 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerStarted","Data":"14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.980699 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981844 4755 generic.go:334] "Generic (PLEG): container finished" podID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" exitCode=0 Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981902 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" 
event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981924 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" event={"ID":"ee532ae9-f63a-4f8c-82db-3d81014a6e05","Type":"ContainerDied","Data":"973fbf181b3cd53cfc9d88b983db4e84c26d3e70cf15d7f503f2e3da897707a2"} Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.981941 4755 scope.go:117] "RemoveContainer" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.982082 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-4rsdm" Mar 20 13:50:51 crc kubenswrapper[4755]: I0320 13:50:51.985974 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"fc400527a488146e60c86d906397fc6d294b199a76c49bdc269ea31150823810"} Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.003814 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.244181571 podStartE2EDuration="10.003787589s" podCreationTimestamp="2026-03-20 13:50:42 +0000 UTC" firstStartedPulling="2026-03-20 13:50:44.469874013 +0000 UTC m=+1224.067806542" lastFinishedPulling="2026-03-20 13:50:51.229480031 +0000 UTC m=+1230.827412560" observedRunningTime="2026-03-20 13:50:52.00345698 +0000 UTC m=+1231.601389509" watchObservedRunningTime="2026-03-20 13:50:52.003787589 +0000 UTC m=+1231.601720118" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.062111 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.068817 4755 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.075225 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-4rsdm"] Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.085639 4755 scope.go:117] "RemoveContainer" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.129874 4755 scope.go:117] "RemoveContainer" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:52 crc kubenswrapper[4755]: E0320 13:50:52.131966 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": container with ID starting with 54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c not found: ID does not exist" containerID="54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132002 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c"} err="failed to get container status \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": rpc error: code = NotFound desc = could not find container \"54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c\": container with ID starting with 54db3fc726750118a5107ec2a36ed15dbe4673bc5364be729ba176562324590c not found: ID does not exist" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132029 4755 scope.go:117] "RemoveContainer" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: E0320 13:50:52.132233 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": container with ID starting with 418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89 not found: ID does not exist" containerID="418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89" Mar 20 13:50:52 crc kubenswrapper[4755]: I0320 13:50:52.132255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89"} err="failed to get container status \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": rpc error: code = NotFound desc = could not find container \"418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89\": container with ID starting with 418c0d9893d41892d1e63aada1cdad3c95bc3429ae0c38cb1a6ee93b56075b89 not found: ID does not exist" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.024809 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"9894c7cb-7899-4354-a6c2-e7339eb1f765","Type":"ContainerStarted","Data":"de5eff2016acb7db7708d693cc4fee0449e99b199af2bf67fb6642781cb1cebd"} Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025012 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" containerID="cri-o://4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" gracePeriod=30 Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025721 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.025779 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" 
containerID="cri-o://7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" gracePeriod=30 Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.070813 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.0707903 podStartE2EDuration="5.0707903s" podCreationTimestamp="2026-03-20 13:50:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:53.05409106 +0000 UTC m=+1232.652023599" watchObservedRunningTime="2026-03-20 13:50:53.0707903 +0000 UTC m=+1232.668722829" Mar 20 13:50:53 crc kubenswrapper[4755]: I0320 13:50:53.263686 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" path="/var/lib/kubelet/pods/ee532ae9-f63a-4f8c-82db-3d81014a6e05/volumes" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.030676 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" exitCode=0 Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.030691 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.749350 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.878292 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992766 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992828 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992854 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992910 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.992931 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.993055 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xpt9\" (UniqueName: 
\"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") pod \"b9649141-1c8e-4387-8cfc-81d60abf76f3\" (UID: \"b9649141-1c8e-4387-8cfc-81d60abf76f3\") " Mar 20 13:50:54 crc kubenswrapper[4755]: I0320 13:50:54.993892 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.000545 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9" (OuterVolumeSpecName: "kube-api-access-9xpt9") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "kube-api-access-9xpt9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.002993 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts" (OuterVolumeSpecName: "scripts") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.028825 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.087912 4755 generic.go:334] "Generic (PLEG): container finished" podID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088006 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088035 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"b9649141-1c8e-4387-8cfc-81d60abf76f3","Type":"ContainerDied","Data":"9bab4e710eb88089079a33e7c9e02e8a26667fa5d5cf668bd290bf2ec7796c32"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088056 4755 scope.go:117] "RemoveContainer" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.088236 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098096 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098126 4755 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/b9649141-1c8e-4387-8cfc-81d60abf76f3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098137 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.098149 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xpt9\" (UniqueName: \"kubernetes.io/projected/b9649141-1c8e-4387-8cfc-81d60abf76f3-kube-api-access-9xpt9\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.140889 4755 generic.go:334] "Generic (PLEG): container finished" podID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.143099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.166995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.201423 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.253642 4755 scope.go:117] "RemoveContainer" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.286097 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data" (OuterVolumeSpecName: "config-data") pod "b9649141-1c8e-4387-8cfc-81d60abf76f3" (UID: "b9649141-1c8e-4387-8cfc-81d60abf76f3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.296928 4755 scope.go:117] "RemoveContainer" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.300768 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": container with ID starting with 7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860 not found: ID does not exist" containerID="7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.300931 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860"} err="failed to get container status \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": rpc error: code = NotFound desc = could not find container \"7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860\": container with ID starting with 7728d736efc3b0e178c833b0a909c74e314e02b782bd0c8e0a907322666eb860 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.301011 4755 scope.go:117] "RemoveContainer" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.302946 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9649141-1c8e-4387-8cfc-81d60abf76f3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.303142 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": container 
with ID starting with 4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b not found: ID does not exist" containerID="4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.303232 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b"} err="failed to get container status \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": rpc error: code = NotFound desc = could not find container \"4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b\": container with ID starting with 4fd39590efeebddfaf19c2a8b786d8e237cab1e0eafd60e2739d4ce8327fbf8b not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.418806 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.432441 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.447907 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448253 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="init" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448269 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="init" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448282 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448288 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" 
Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448300 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448306 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448326 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448332 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448344 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448351 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: E0320 13:50:55.448362 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448368 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-httpd" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448537 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="probe" Mar 20 13:50:55 crc 
kubenswrapper[4755]: I0320 13:50:55.448549 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee532ae9-f63a-4f8c-82db-3d81014a6e05" containerName="dnsmasq-dns" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448564 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4c0d88b-a127-41a4-824c-e09a285a5a62" containerName="neutron-api" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.448578 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" containerName="cinder-scheduler" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.449438 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.453960 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.462460 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.506787 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507043 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507117 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507214 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.507408 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609084 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.609960 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.610645 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.611272 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.612144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.612335 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") 
" pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.615455 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-scripts\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.616011 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.616899 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-config-data\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.615517 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.630233 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27t6t\" (UniqueName: \"kubernetes.io/projected/df39e954-98b1-4c7c-bc51-5c2ee4db8a6d-kube-api-access-27t6t\") pod \"cinder-scheduler-0\" (UID: \"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d\") " pod="openstack/cinder-scheduler-0" Mar 20 13:50:55 crc kubenswrapper[4755]: I0320 13:50:55.779509 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:50:56 crc kubenswrapper[4755]: I0320 13:50:56.406767 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.192767 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"29dab42a4ce34544dcbff3f9d7bf78f6406ef710c481cc96b3250981650aebc9"} Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.193538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"8ffc3caa8314737810ddb54f5d0eeeaaceaecf61b2e3c37e7d1af0e39ccd4b1d"} Mar 20 13:50:57 crc kubenswrapper[4755]: I0320 13:50:57.253356 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9649141-1c8e-4387-8cfc-81d60abf76f3" path="/var/lib/kubelet/pods/b9649141-1c8e-4387-8cfc-81d60abf76f3/volumes" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.204717 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"df39e954-98b1-4c7c-bc51-5c2ee4db8a6d","Type":"ContainerStarted","Data":"5dee25b7c414f01aba2ffc1c56e8fcb3fd3f862b3df29766bfd1850c0e50dc27"} Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.237137 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.237117713 podStartE2EDuration="3.237117713s" podCreationTimestamp="2026-03-20 13:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:50:58.23429258 +0000 UTC m=+1237.832225109" watchObservedRunningTime="2026-03-20 13:50:58.237117713 +0000 UTC m=+1237.835050242" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 
13:50:58.475942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.850335 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7769db74db-f4kfh" Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960222 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"] Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960780 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" containerID="cri-o://901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" gracePeriod=30 Mar 20 13:50:58 crc kubenswrapper[4755]: I0320 13:50:58.960973 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" containerID="cri-o://58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" gracePeriod=30 Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.250998 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4" exitCode=143 Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.252131 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} Mar 20 13:50:59 crc kubenswrapper[4755]: I0320 13:50:59.731322 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-8f554bbf4-zvxzv" Mar 20 13:51:00 crc kubenswrapper[4755]: 
I0320 13:51:00.780590 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.014851 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.016702 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020444 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020955 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nvpnw" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.020997 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.037537 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040164 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040224 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040266 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.040298 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f69f9\" (UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142264 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142327 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142367 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.142401 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f69f9\" 
(UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.143496 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.164522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-openstack-config-secret\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.167368 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96136572-ead6-4771-bd36-eec29b5fb137-combined-ca-bundle\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.167813 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f69f9\" (UniqueName: \"kubernetes.io/projected/96136572-ead6-4771-bd36-eec29b5fb137-kube-api-access-f69f9\") pod \"openstackclient\" (UID: \"96136572-ead6-4771-bd36-eec29b5fb137\") " pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.340904 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.435824 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:51:01 crc kubenswrapper[4755]: I0320 13:51:01.896837 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:51:02 crc kubenswrapper[4755]: I0320 13:51:02.280920 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"96136572-ead6-4771-bd36-eec29b5fb137","Type":"ContainerStarted","Data":"ffcf9257ec195bcf122894453f661953d7455e45a95cd3f0c29abf85530ccf53"} Mar 20 13:51:03 crc kubenswrapper[4755]: I0320 13:51:03.767054 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:35668->10.217.0.165:9311: read: connection reset by peer" Mar 20 13:51:03 crc kubenswrapper[4755]: I0320 13:51:03.768044 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:35658->10.217.0.165:9311: read: connection reset by peer" Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.285882 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317814 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") "
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317911 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") "
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.317971 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") "
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.318074 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") "
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.318105 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") pod \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\" (UID: \"d4952a5b-9ca7-4ae1-bcd6-0598511fb809\") "
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.319586 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs" (OuterVolumeSpecName: "logs") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.326630 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9" (OuterVolumeSpecName: "kube-api-access-52gh9") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "kube-api-access-52gh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336134 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f" exitCode=0
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336184 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"}
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336211 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7f8cb8c64b-l8cp4" event={"ID":"d4952a5b-9ca7-4ae1-bcd6-0598511fb809","Type":"ContainerDied","Data":"95166cf5a1430ac49b682e214261fcd32c3c05c72a9b931d676bf64e207ffdda"}
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336231 4755 scope.go:117] "RemoveContainer" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.336434 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7f8cb8c64b-l8cp4"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.348779 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.373841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.414748 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data" (OuterVolumeSpecName: "config-data") pod "d4952a5b-9ca7-4ae1-bcd6-0598511fb809" (UID: "d4952a5b-9ca7-4ae1-bcd6-0598511fb809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.419996 4755 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420032 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420042 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52gh9\" (UniqueName: \"kubernetes.io/projected/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-kube-api-access-52gh9\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420052 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.420062 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4952a5b-9ca7-4ae1-bcd6-0598511fb809-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.440986 4755 scope.go:117] "RemoveContainer" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.472225 4755 scope.go:117] "RemoveContainer" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"
Mar 20 13:51:04 crc kubenswrapper[4755]: E0320 13:51:04.473233 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": container with ID starting with 58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f not found: ID does not exist" containerID="58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473270 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f"} err="failed to get container status \"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": rpc error: code = NotFound desc = could not find container \"58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f\": container with ID starting with 58678e92212d4d898848f42cd1015b0503238e6bd7877fe6ff9eed22dbc5231f not found: ID does not exist"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473299 4755 scope.go:117] "RemoveContainer" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"
Mar 20 13:51:04 crc kubenswrapper[4755]: E0320 13:51:04.473823 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": container with ID starting with 901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4 not found: ID does not exist" containerID="901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.473873 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4"} err="failed to get container status \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": rpc error: code = NotFound desc = could not find container \"901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4\": container with ID starting with 901b8e574c81ae507325135e524cd78cf4a743b0a15abd8ddc7989c7316b75b4 not found: ID does not exist"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.681050 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"]
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.691149 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7f8cb8c64b-l8cp4"]
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.749173 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.944261 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65884d74bb-n9mkw"
Mar 20 13:51:04 crc kubenswrapper[4755]: I0320 13:51:04.996267 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-65884d74bb-n9mkw"
Mar 20 13:51:05 crc kubenswrapper[4755]: I0320 13:51:05.237985 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" path="/var/lib/kubelet/pods/d4952a5b-9ca7-4ae1-bcd6-0598511fb809/volumes"
Mar 20 13:51:06 crc kubenswrapper[4755]: I0320 13:51:06.415366 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.636188 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"]
Mar 20 13:51:07 crc kubenswrapper[4755]: E0320 13:51:07.636959 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.636973 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log"
Mar 20 13:51:07 crc kubenswrapper[4755]: E0320 13:51:07.636994 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637001 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637197 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api-log"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.637211 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4952a5b-9ca7-4ae1-bcd6-0598511fb809" containerName="barbican-api"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.638139 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.641083 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.641358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.647367 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.673806 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"]
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.697870 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dt7\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.697978 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698015 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698047 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698770 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698889 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698966 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.698996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801049 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dt7\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801109 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801130 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801150 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801232 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.801875 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-log-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.802175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/12a81787-83e5-4552-85e6-19733309756d-run-httpd\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.808825 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-internal-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.809565 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-public-tls-certs\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.809762 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-config-data\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.812333 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-etc-swift\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.819332 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a81787-83e5-4552-85e6-19733309756d-combined-ca-bundle\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.828139 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dt7\" (UniqueName: \"kubernetes.io/projected/12a81787-83e5-4552-85e6-19733309756d-kube-api-access-b4dt7\") pod \"swift-proxy-847679bbfc-l8kwj\" (UID: \"12a81787-83e5-4552-85e6-19733309756d\") " pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:07 crc kubenswrapper[4755]: I0320 13:51:07.966303 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-847679bbfc-l8kwj"
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258105 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258540 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" containerID="cri-o://c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" gracePeriod=30
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258700 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" containerID="cri-o://6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006" gracePeriod=30
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258860 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" containerID="cri-o://1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6" gracePeriod=30
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.258953 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" containerID="cri-o://14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63" gracePeriod=30
Mar 20 13:51:08 crc kubenswrapper[4755]: I0320 13:51:08.292567 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF"
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.400820 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63" exitCode=0
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401190 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006" exitCode=2
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401205 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6" exitCode=0
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401215 4755 generic.go:334] "Generic (PLEG): container finished" podID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerID="c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" exitCode=0
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.400978 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63"}
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401260 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006"}
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6"}
Mar 20 13:51:09 crc kubenswrapper[4755]: I0320 13:51:09.401296 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f"}
Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.387098 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.387626 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" containerID="cri-o://13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" gracePeriod=30
Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.388118 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" containerID="cri-o://44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" gracePeriod=30
Mar 20 13:51:12 crc kubenswrapper[4755]: I0320 13:51:12.661909 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": dial tcp 10.217.0.169:3000: connect: connection refused"
Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353318 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353771 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" containerID="cri-o://7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" gracePeriod=30
Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.353887 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" containerID="cri-o://22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" gracePeriod=30
Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.457573 4755 generic.go:334] "Generic (PLEG): container finished" podID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" exitCode=143
Mar 20 13:51:13 crc kubenswrapper[4755]: I0320 13:51:13.457624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"}
Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.471618 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" exitCode=143
Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.471987 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"}
Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.750860 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54fd48b444-c4c9l" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused"
Mar 20 13:51:14 crc kubenswrapper[4755]: I0320 13:51:14.751011 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54fd48b444-c4c9l"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.222048 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277505 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277532 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277639 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277728 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.277771 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") pod \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\" (UID: \"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe\") "
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.280504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.280632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.282883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts" (OuterVolumeSpecName: "scripts") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.283387 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f" (OuterVolumeSpecName: "kube-api-access-vk59f") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "kube-api-access-vk59f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.305277 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.348964 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.367701 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data" (OuterVolumeSpecName: "config-data") pod "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" (UID: "c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380823 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380871 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380907 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380918 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380927 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380936 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.380947 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk59f\" (UniqueName: \"kubernetes.io/projected/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe-kube-api-access-vk59f\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484389 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe","Type":"ContainerDied","Data":"63a07ea9507732987fa76339a9da53fdf9074739ca064b32427bb55875628827"}
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484435 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.484468 4755 scope.go:117] "RemoveContainer" containerID="14bc281fedc648aa2111a1da4a641fc469fc58529452566b8d129178278e5f63"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.489994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"96136572-ead6-4771-bd36-eec29b5fb137","Type":"ContainerStarted","Data":"8093f8dcdf9807804a717a6ff0bd78208a3f8da65f0e0452029772bc9aba27d1"}
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.501395 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-847679bbfc-l8kwj"]
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.506841 4755 scope.go:117] "RemoveContainer" containerID="6cdb8f12ea733e2a1062d6144c8b8dec4b3814bba46c3e3349586e3f17230006"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.538910 4755 scope.go:117] "RemoveContainer" containerID="1c37112d0c1ff3c6d6454da40662a873c2fd167070a38bfd5711fecb777438a6"
Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.549331 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/openstackclient" podStartSLOduration=2.455528611 podStartE2EDuration="15.549309822s" podCreationTimestamp="2026-03-20 13:51:00 +0000 UTC" firstStartedPulling="2026-03-20 13:51:01.901499892 +0000 UTC m=+1241.499432421" lastFinishedPulling="2026-03-20 13:51:14.995281103 +0000 UTC m=+1254.593213632" observedRunningTime="2026-03-20 13:51:15.525031277 +0000 UTC m=+1255.122963806" watchObservedRunningTime="2026-03-20 13:51:15.549309822 +0000 UTC m=+1255.147242361" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.562198 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.566450 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:59152->10.217.0.156:9292: read: connection reset by peer" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.566706 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": read tcp 10.217.0.2:59154->10.217.0.156:9292: read: connection reset by peer" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.580842 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.592059 4755 scope.go:117] "RemoveContainer" containerID="c48dac781e5b4104ebe6a032749f8a05fbae8019f0a25fefe10e85e87b80351f" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594057 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594522 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594543 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594557 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594565 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594588 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594594 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: E0320 13:51:15.594615 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594622 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594846 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="sg-core" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594878 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-central-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594890 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="ceilometer-notification-agent" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.594899 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" containerName="proxy-httpd" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.596663 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.602011 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.610338 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.610670 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687496 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687548 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687572 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687615 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.687669 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.688068 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.688312 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790163 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " 
pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790265 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790348 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790379 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790403 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.790506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.791804 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.798117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.805318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.805964 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.806086 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.806142 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.815454 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"ceilometer-0\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") " pod="openstack/ceilometer-0" Mar 20 13:51:15 crc kubenswrapper[4755]: I0320 13:51:15.929103 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.105770 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199559 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199817 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199862 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: 
\"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199899 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.199950 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200093 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.200207 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") pod \"d489e08f-1107-45f2-b1d0-c9b786974ee4\" (UID: \"d489e08f-1107-45f2-b1d0-c9b786974ee4\") " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.205306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs" (OuterVolumeSpecName: 
"logs") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.211416 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.215816 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts" (OuterVolumeSpecName: "scripts") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.218016 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw" (OuterVolumeSpecName: "kube-api-access-l4bvw") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "kube-api-access-l4bvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.228872 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.274855 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303360 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303399 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303409 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d489e08f-1107-45f2-b1d0-c9b786974ee4-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303435 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303445 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4bvw\" (UniqueName: \"kubernetes.io/projected/d489e08f-1107-45f2-b1d0-c9b786974ee4-kube-api-access-l4bvw\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.303455 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.355344 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.358837 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.378444 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data" (OuterVolumeSpecName: "config-data") pod "d489e08f-1107-45f2-b1d0-c9b786974ee4" (UID: "d489e08f-1107-45f2-b1d0-c9b786974ee4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413537 4755 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413602 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.413620 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d489e08f-1107-45f2-b1d0-c9b786974ee4-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.520573 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:46726->10.217.0.155:9292: read: connection reset by peer" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.520993 4755 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": read tcp 10.217.0.2:46736->10.217.0.155:9292: read: connection reset by peer" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545002 4755 generic.go:334] "Generic (PLEG): container finished" podID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" exitCode=0 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545124 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545177 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d489e08f-1107-45f2-b1d0-c9b786974ee4","Type":"ContainerDied","Data":"2a8c9089e1e7047efefde5df6ddda04cf4071fad9b8adaef75c6dc167f6e724d"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545196 4755 scope.go:117] "RemoveContainer" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.545327 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.570857 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"513c4483cc521844fc779ad4ad8fea88a8f59c0d75bd000160932fcd3bb65cd0"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.570928 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"492850182a806432985c0a76db5d357fdd1f75f64e462ac1c324547eddbc580a"} Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.660725 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.718632 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.723814 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-754b98cbff-jgntp" 
Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.727572 4755 scope.go:117] "RemoveContainer" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.784704 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.811285 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.811679 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.820559 4755 scope.go:117] "RemoveContainer" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.811692 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.822070 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822091 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.822559 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": container with ID starting with 44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7 not found: ID does not exist" containerID="44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822592 4755 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-httpd" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822596 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7"} err="failed to get container status \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": rpc error: code = NotFound desc = could not find container \"44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7\": container with ID starting with 44493bc8379c24a23c32f1c18af13c869f56b53b5e47a35d3d92bb99a5bf63f7 not found: ID does not exist" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822624 4755 scope.go:117] "RemoveContainer" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.822636 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" containerName="glance-log" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.823634 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: E0320 13:51:16.823692 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": container with ID starting with 13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b not found: ID does not exist" containerID="13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.823711 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b"} err="failed to get container status \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": rpc error: code = NotFound desc = could not find container \"13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b\": container with ID starting with 13610e3e989ee906b7fb560daf7a34def133968c9fc77b1c7649f1ce24179a2b not found: ID does not exist" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.827583 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.830370 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.836032 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.893752 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.894173 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cf99699dd-lg99t" 
podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" containerID="cri-o://7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" gracePeriod=30 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.894854 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7cf99699dd-lg99t" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" containerID="cri-o://cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" gracePeriod=30 Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956204 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956243 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956289 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956323 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") 
pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956364 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956404 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956420 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:16 crc kubenswrapper[4755]: I0320 13:51:16.956444 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058273 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod 
\"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058371 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058415 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058458 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058500 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: 
\"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058547 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.058604 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059032 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-logs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059247 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.059838 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e65d1645-8a19-459e-ac89-b485f27e2841-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 
20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.067152 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-config-data\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.070947 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.073435 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-scripts\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.091027 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj222\" (UniqueName: \"kubernetes.io/projected/e65d1645-8a19-459e-ac89-b485f27e2841-kube-api-access-lj222\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.093318 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e65d1645-8a19-459e-ac89-b485f27e2841-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.100485 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-external-api-0\" (UID: \"e65d1645-8a19-459e-ac89-b485f27e2841\") " pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.147285 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.257553 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe" path="/var/lib/kubelet/pods/c41bd0a8-9b4e-4a16-ac21-e4d25ba89fbe/volumes" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.258744 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d489e08f-1107-45f2-b1d0-c9b786974ee4" path="/var/lib/kubelet/pods/d489e08f-1107-45f2-b1d0-c9b786974ee4/volumes" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.536992 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589500 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589570 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589611 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589715 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589740 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.589835 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") pod \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\" (UID: \"74e04f8c-57a9-4c29-b9ae-5fea257f36da\") " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.592120 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.596240 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs" (OuterVolumeSpecName: "logs") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.607144 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r" (OuterVolumeSpecName: "kube-api-access-2ng4r") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "kube-api-access-2ng4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.609189 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.613541 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts" (OuterVolumeSpecName: "scripts") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.615041 4755 generic.go:334] "Generic (PLEG): container finished" podID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerID="cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" exitCode=0 Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.615142 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.643865 4755 generic.go:334] "Generic (PLEG): container finished" podID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" exitCode=0 Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.643961 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"74e04f8c-57a9-4c29-b9ae-5fea257f36da","Type":"ContainerDied","Data":"e2bc4a76cdde92278100366b804ccffd9de5944afb2aabb9539badf236736c8a"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644023 4755 scope.go:117] "RemoveContainer" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.644187 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703066 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-847679bbfc-l8kwj" event={"ID":"12a81787-83e5-4552-85e6-19733309756d","Type":"ContainerStarted","Data":"063ac1b6b9806b98d914b194e9bde9a87a0a8f7cc8c074cb5f40a75a8fe6d0e5"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703083 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ng4r\" (UniqueName: \"kubernetes.io/projected/74e04f8c-57a9-4c29-b9ae-5fea257f36da-kube-api-access-2ng4r\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703157 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703172 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703187 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/74e04f8c-57a9-4c29-b9ae-5fea257f36da-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.703217 4755 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.712950 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.713290 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.721825 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"e82f1532e1a1c38107bb859460f4520da9baccf34fa7c549aea69d074c192f66"} Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.759950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.784798 4755 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.786032 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-847679bbfc-l8kwj" podStartSLOduration=10.786002963 podStartE2EDuration="10.786002963s" podCreationTimestamp="2026-03-20 13:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:17.77307318 +0000 UTC m=+1257.371005709" watchObservedRunningTime="2026-03-20 13:51:17.786002963 +0000 UTC m=+1257.383935492" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.809984 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.810027 4755 reconciler_common.go:293] "Volume detached for volume 
\"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.814550 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.859248 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data" (OuterVolumeSpecName: "config-data") pod "74e04f8c-57a9-4c29-b9ae-5fea257f36da" (UID: "74e04f8c-57a9-4c29-b9ae-5fea257f36da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.912787 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.912832 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/74e04f8c-57a9-4c29-b9ae-5fea257f36da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.933818 4755 scope.go:117] "RemoveContainer" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:17 crc kubenswrapper[4755]: I0320 13:51:17.997990 4755 scope.go:117] "RemoveContainer" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.000958 4755 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": container with ID starting with 22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3 not found: ID does not exist" containerID="22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.000988 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3"} err="failed to get container status \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": rpc error: code = NotFound desc = could not find container \"22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3\": container with ID starting with 22ea9bb1439225fa171c5c43bab98e1a15231a0fd45286babcf5c7b5422ed2e3 not found: ID does not exist" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.001008 4755 scope.go:117] "RemoveContainer" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.001243 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": container with ID starting with 7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589 not found: ID does not exist" containerID="7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.001260 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589"} err="failed to get container status \"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": rpc error: code = NotFound desc = could not find container 
\"7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589\": container with ID starting with 7eed8133cb348a332e93246dbd6126d4bbf251bd33d3c79007e9210f52356589 not found: ID does not exist" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.002599 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.024911 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.039946 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.040409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040422 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: E0320 13:51:18.040452 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040458 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040610 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-log" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.040634 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" containerName="glance-httpd" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.041490 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.047094 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.047248 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.072536 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.104261 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120487 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120839 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.120947 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc 
kubenswrapper[4755]: I0320 13:51:18.121067 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121178 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121304 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121412 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.121544 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 
13:51:18 crc kubenswrapper[4755]: W0320 13:51:18.139099 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode65d1645_8a19_459e_ac89_b485f27e2841.slice/crio-c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf WatchSource:0}: Error finding container c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf: Status 404 returned error can't find the container with id c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223153 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223207 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223230 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223246 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223281 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223308 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.223436 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.224288 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-logs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.224790 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b182ae3-20c9-48af-9313-d48a608924b1-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.225037 4755 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.233397 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.234007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.234117 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc 
kubenswrapper[4755]: I0320 13:51:18.234456 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b182ae3-20c9-48af-9313-d48a608924b1-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.255372 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vbj\" (UniqueName: \"kubernetes.io/projected/6b182ae3-20c9-48af-9313-d48a608924b1-kube-api-access-w7vbj\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.273871 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-internal-api-0\" (UID: \"6b182ae3-20c9-48af-9313-d48a608924b1\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.365000 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.499850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.825798 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"} Mar 20 13:51:18 crc kubenswrapper[4755]: I0320 13:51:18.841739 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"c047a2904ffd52950e4234e5898c060518c20a171e9be6fe39bc6506143f28bf"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.187531 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:51:19 crc kubenswrapper[4755]: W0320 13:51:19.198089 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b182ae3_20c9_48af_9313_d48a608924b1.slice/crio-b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303 WatchSource:0}: Error finding container b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303: Status 404 returned error can't find the container with id b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303 Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.237448 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e04f8c-57a9-4c29-b9ae-5fea257f36da" path="/var/lib/kubelet/pods/74e04f8c-57a9-4c29-b9ae-5fea257f36da/volumes" Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.874628 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.875670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.900141 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"0687b0972ba88ee0ba6ba105a270c9eafaab3c519d6923105497f9948a2f55ec"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.927548 4755 generic.go:334] "Generic (PLEG): container finished" podID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerID="7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" exitCode=0 Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.927634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1"} Mar 20 13:51:19 crc kubenswrapper[4755]: I0320 13:51:19.929887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"b5516f393e37164f8e3101af7e1580dbfb3d1368a93273a957ff4b188a24d303"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:19.999994 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159286 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159761 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.159982 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.160211 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.160450 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") pod \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\" (UID: \"bb70d5b8-33a3-4299-bae5-d13d998e11a2\") " Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.185750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh" (OuterVolumeSpecName: "kube-api-access-grrbh") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "kube-api-access-grrbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.186835 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.262330 4755 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.262400 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grrbh\" (UniqueName: \"kubernetes.io/projected/bb70d5b8-33a3-4299-bae5-d13d998e11a2-kube-api-access-grrbh\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.264563 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.277644 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config" (OuterVolumeSpecName: "config") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.317735 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "bb70d5b8-33a3-4299-bae5-d13d998e11a2" (UID: "bb70d5b8-33a3-4299-bae5-d13d998e11a2"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364371 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364404 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.364414 4755 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/bb70d5b8-33a3-4299-bae5-d13d998e11a2-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941390 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7cf99699dd-lg99t" 
event={"ID":"bb70d5b8-33a3-4299-bae5-d13d998e11a2","Type":"ContainerDied","Data":"c78877e8c0818f15b0f2e1b6adc8bc55f8f067e988d6b39df713c8a50ee71484"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941735 4755 scope.go:117] "RemoveContainer" containerID="cc3d6ad570ae6efe2febe508a4ae6a81c1e81d05b213aff57419ed9d5e14201c" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.941473 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7cf99699dd-lg99t" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.945419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"651a569f8e307be4d0515c4e75ec2a428af80a6c718afc1987e63aae8dae8f60"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.945469 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6b182ae3-20c9-48af-9313-d48a608924b1","Type":"ContainerStarted","Data":"2fbb124138d8dd562d0f2c1fd811d9014a8cae04e612041f70cd59d89cbe840e"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.951341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e65d1645-8a19-459e-ac89-b485f27e2841","Type":"ContainerStarted","Data":"16bfd23b519b858985c83d29ed9143cc5646aede3b90145aebf288ed681971db"} Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.966259 4755 scope.go:117] "RemoveContainer" containerID="7fa0e70fddfd5391932714762c08ddfc9f45bb54801dac970397d34a46d312e1" Mar 20 13:51:20 crc kubenswrapper[4755]: I0320 13:51:20.974253 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.97423423 podStartE2EDuration="3.97423423s" podCreationTimestamp="2026-03-20 13:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:20.968735948 +0000 UTC m=+1260.566668477" watchObservedRunningTime="2026-03-20 13:51:20.97423423 +0000 UTC m=+1260.572166759" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.010368 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.024766 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7cf99699dd-lg99t"] Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.030056 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.030034318 podStartE2EDuration="5.030034318s" podCreationTimestamp="2026-03-20 13:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:21.01344623 +0000 UTC m=+1260.611378759" watchObservedRunningTime="2026-03-20 13:51:21.030034318 +0000 UTC m=+1260.627966847" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.251211 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" path="/var/lib/kubelet/pods/bb70d5b8-33a3-4299-bae5-d13d998e11a2/volumes" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.417688 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491713 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491792 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.491965 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492037 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492668 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492717 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.492779 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") pod \"12871c7a-ef63-447d-b1f6-27a5645dbc21\" (UID: \"12871c7a-ef63-447d-b1f6-27a5645dbc21\") " Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.493179 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs" (OuterVolumeSpecName: "logs") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.493408 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12871c7a-ef63-447d-b1f6-27a5645dbc21-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.505804 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.509249 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds" (OuterVolumeSpecName: "kube-api-access-525ds") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "kube-api-access-525ds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.533403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data" (OuterVolumeSpecName: "config-data") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.538866 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.559879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts" (OuterVolumeSpecName: "scripts") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.583081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "12871c7a-ef63-447d-b1f6-27a5645dbc21" (UID: "12871c7a-ef63-447d-b1f6-27a5645dbc21"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595623 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595672 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595682 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-525ds\" (UniqueName: \"kubernetes.io/projected/12871c7a-ef63-447d-b1f6-27a5645dbc21-kube-api-access-525ds\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595698 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595709 4755 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/12871c7a-ef63-447d-b1f6-27a5645dbc21-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.595718 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/12871c7a-ef63-447d-b1f6-27a5645dbc21-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962160 4755 generic.go:334] "Generic (PLEG): container finished" podID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" exitCode=137 Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54fd48b444-c4c9l" Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962225 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54fd48b444-c4c9l" event={"ID":"12871c7a-ef63-447d-b1f6-27a5645dbc21","Type":"ContainerDied","Data":"fd011c06f3fffccd2ebc454db1a10f42c4b31b9cc3cdee3a458a0730af40410b"} Mar 20 13:51:21 crc kubenswrapper[4755]: I0320 13:51:21.962339 4755 scope.go:117] "RemoveContainer" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.007862 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.014947 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54fd48b444-c4c9l"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.158406 4755 scope.go:117] "RemoveContainer" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.181561 4755 scope.go:117] "RemoveContainer" 
containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.182604 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": container with ID starting with defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c not found: ID does not exist" containerID="defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.182666 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c"} err="failed to get container status \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": rpc error: code = NotFound desc = could not find container \"defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c\": container with ID starting with defdbd916f39e411f2e9c155c9e19c48a52b9f8797ab38bad462853389aaac2c not found: ID does not exist" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.182698 4755 scope.go:117] "RemoveContainer" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.183139 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": container with ID starting with 50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8 not found: ID does not exist" containerID="50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.183164 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8"} err="failed to get container status \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": rpc error: code = NotFound desc = could not find container \"50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8\": container with ID starting with 50e9f3f3cea809fd6178186d31762a8da9931ec27f0f0a85458f74e1226783f8 not found: ID does not exist" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.888678 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889131 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889151 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889170 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889179 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889198 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889206 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: E0320 13:51:22.889220 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 
crc kubenswrapper[4755]: I0320 13:51:22.889229 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889461 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon-log" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889492 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" containerName="horizon" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889505 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-httpd" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.889516 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb70d5b8-33a3-4299-bae5-d13d998e11a2" containerName="neutron-api" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.890823 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.906579 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.921675 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.921794 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.984940 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerStarted","Data":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985146 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" containerID="cri-o://3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985398 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985680 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" containerID="cri-o://c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985725 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" containerID="cri-o://d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" gracePeriod=30 Mar 20 13:51:22 crc kubenswrapper[4755]: I0320 13:51:22.985762 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" containerID="cri-o://028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" gracePeriod=30 Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.001743 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.003412 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.004618 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.026463 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.026573 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027494 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027502 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.517847701 podStartE2EDuration="8.027483802s" podCreationTimestamp="2026-03-20 13:51:15 +0000 UTC" firstStartedPulling="2026-03-20 13:51:16.727566342 +0000 UTC m=+1256.325498861" lastFinishedPulling="2026-03-20 13:51:22.237202433 +0000 UTC m=+1261.835134962" observedRunningTime="2026-03-20 13:51:23.011211652 +0000 UTC m=+1262.609144181" watchObservedRunningTime="2026-03-20 13:51:23.027483802 +0000 UTC m=+1262.625416331" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.027828 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] 
Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.033535 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-847679bbfc-l8kwj" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.074017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"nova-api-db-create-9jv87\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") " pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.138769 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.139051 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.211462 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-9jv87" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.241255 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.241378 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.242185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.265941 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12871c7a-ef63-447d-b1f6-27a5645dbc21" path="/var/lib/kubelet/pods/12871c7a-ef63-447d-b1f6-27a5645dbc21/volumes" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.268103 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.270014 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.273903 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"nova-cell0-db-create-79jc8\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") " pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.283885 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.286053 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.293454 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.315310 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.329274 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.436377 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.438204 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.442822 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.464172 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.465898 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.465997 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.466024 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.466128 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod 
\"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.469486 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573339 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573556 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " 
pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573713 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.573763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.575542 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.576822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.601513 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"nova-api-c99c-account-create-update-5s889\" (UID: 
\"03accbff-bdf2-4256-bdf2-1b39d5485673\") " pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.630396 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"nova-cell1-db-create-jqk4f\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") " pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.658711 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.659098 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.659973 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.673025 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.674761 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.674798 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: 
\"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.675779 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.709889 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.728241 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"nova-cell0-4e76-account-create-update-vjcr6\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") " pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.759139 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.789251 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.789311 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.799919 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.900706 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.900772 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.902328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.931263 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"nova-cell1-ee84-account-create-update-jpmvf\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") " pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:23 crc kubenswrapper[4755]: I0320 13:51:23.932944 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 
13:51:24.018950 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.070926 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerStarted","Data":"80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091931 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b" exitCode=0 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091974 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e" exitCode=2 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.091983 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7" exitCode=0 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093265 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093301 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.093316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.266473 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.478118 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.553876 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.705641 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 13:51:24 crc kubenswrapper[4755]: W0320 13:51:24.715563 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03accbff_bdf2_4256_bdf2_1b39d5485673.slice/crio-6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622 WatchSource:0}: Error finding container 6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622: Status 404 returned error can't find the container with id 6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622 Mar 20 13:51:24 crc kubenswrapper[4755]: I0320 13:51:24.846494 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 13:51:24 crc kubenswrapper[4755]: W0320 13:51:24.929783 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39991203_9b8d_4985_8e90_b3d1772f6b8f.slice/crio-108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81 WatchSource:0}: Error finding container 108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81: Status 404 returned error can't 
find the container with id 108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81 Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114310 4755 generic.go:334] "Generic (PLEG): container finished" podID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerID="f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba" exitCode=0 Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114484 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerDied","Data":"f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.114538 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerStarted","Data":"658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.123930 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerStarted","Data":"08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.123998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerStarted","Data":"6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.127585 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerStarted","Data":"108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81"} Mar 20 13:51:25 crc kubenswrapper[4755]: 
I0320 13:51:25.136429 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerStarted","Data":"d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.136518 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerStarted","Data":"b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.177997 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerStarted","Data":"8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.178226 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerStarted","Data":"a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb"} Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.189089 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-c99c-account-create-update-5s889" podStartSLOduration=2.189052167 podStartE2EDuration="2.189052167s" podCreationTimestamp="2026-03-20 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:25.165109579 +0000 UTC m=+1264.763042108" watchObservedRunningTime="2026-03-20 13:51:25.189052167 +0000 UTC m=+1264.786984686" Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.194063 4755 generic.go:334] "Generic (PLEG): container finished" podID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" 
containerID="34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27" exitCode=0
Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.194741 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerDied","Data":"34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27"}
Mar 20 13:51:25 crc kubenswrapper[4755]: I0320 13:51:25.219917 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" podStartSLOduration=2.219896822 podStartE2EDuration="2.219896822s" podCreationTimestamp="2026-03-20 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:25.213141197 +0000 UTC m=+1264.811073716" watchObservedRunningTime="2026-03-20 13:51:25.219896822 +0000 UTC m=+1264.817829351"
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.205582 4755 generic.go:334] "Generic (PLEG): container finished" podID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerID="08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6" exitCode=0
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.206037 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerDied","Data":"08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6"}
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.209075 4755 generic.go:334] "Generic (PLEG): container finished" podID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerID="a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5" exitCode=0
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.209190 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerDied","Data":"a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5"}
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.211048 4755 generic.go:334] "Generic (PLEG): container finished" podID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerID="d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7" exitCode=0
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.211087 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerDied","Data":"d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7"}
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.212606 4755 generic.go:334] "Generic (PLEG): container finished" podID="f395acec-f28b-4622-b349-127cf31ec92d" containerID="8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363" exitCode=0
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.212790 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerDied","Data":"8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363"}
Mar 20 13:51:26 crc kubenswrapper[4755]: E0320 13:51:26.342687 4755 kubelet_node_status.go:756] "Failed to set some node status fields" err="failed to validate nodeIP: route ip+net: no such network interface" node="crc"
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.588340 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f"
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.681796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") pod \"f395acec-f28b-4622-b349-127cf31ec92d\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") "
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.681857 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") pod \"f395acec-f28b-4622-b349-127cf31ec92d\" (UID: \"f395acec-f28b-4622-b349-127cf31ec92d\") "
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.682742 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f395acec-f28b-4622-b349-127cf31ec92d" (UID: "f395acec-f28b-4622-b349-127cf31ec92d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.688898 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k" (OuterVolumeSpecName: "kube-api-access-rlf4k") pod "f395acec-f28b-4622-b349-127cf31ec92d" (UID: "f395acec-f28b-4622-b349-127cf31ec92d"). InnerVolumeSpecName "kube-api-access-rlf4k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.784391 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f395acec-f28b-4622-b349-127cf31ec92d-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.784791 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlf4k\" (UniqueName: \"kubernetes.io/projected/f395acec-f28b-4622-b349-127cf31ec92d-kube-api-access-rlf4k\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.888438 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9jv87"
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.900964 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8"
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.987556 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") pod \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") "
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.987691 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") pod \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\" (UID: \"0deb3f1a-0cad-4429-9e79-38e5a0b38896\") "
Mar 20 13:51:26 crc kubenswrapper[4755]: I0320 13:51:26.989456 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0deb3f1a-0cad-4429-9e79-38e5a0b38896" (UID: "0deb3f1a-0cad-4429-9e79-38e5a0b38896"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.003846 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b" (OuterVolumeSpecName: "kube-api-access-7dw7b") pod "0deb3f1a-0cad-4429-9e79-38e5a0b38896" (UID: "0deb3f1a-0cad-4429-9e79-38e5a0b38896"). InnerVolumeSpecName "kube-api-access-7dw7b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089251 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") pod \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") "
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") pod \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\" (UID: \"32a5606c-c777-4c0b-951c-6ce2e03edd7e\") "
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.089793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "32a5606c-c777-4c0b-951c-6ce2e03edd7e" (UID: "32a5606c-c777-4c0b-951c-6ce2e03edd7e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090128 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0deb3f1a-0cad-4429-9e79-38e5a0b38896-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090148 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/32a5606c-c777-4c0b-951c-6ce2e03edd7e-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.090161 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dw7b\" (UniqueName: \"kubernetes.io/projected/0deb3f1a-0cad-4429-9e79-38e5a0b38896-kube-api-access-7dw7b\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.093844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4" (OuterVolumeSpecName: "kube-api-access-nwfb4") pod "32a5606c-c777-4c0b-951c-6ce2e03edd7e" (UID: "32a5606c-c777-4c0b-951c-6ce2e03edd7e"). InnerVolumeSpecName "kube-api-access-nwfb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.150017 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.150272 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.191876 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwfb4\" (UniqueName: \"kubernetes.io/projected/32a5606c-c777-4c0b-951c-6ce2e03edd7e-kube-api-access-nwfb4\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.205687 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.210854 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.254981 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-9jv87"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.255001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-9jv87" event={"ID":"0deb3f1a-0cad-4429-9e79-38e5a0b38896","Type":"ContainerDied","Data":"80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289"}
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.255080 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80ebb90f8a0e8342a2a656ff82e63acee227c822845c1ae672984d38ad096289"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265050 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jqk4f"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jqk4f" event={"ID":"f395acec-f28b-4622-b349-127cf31ec92d","Type":"ContainerDied","Data":"a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb"}
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.265152 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1978097078625e11f33e5697035e2510a4058e810ddcfdafcf23db8fd892acb"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270287 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-79jc8" event={"ID":"32a5606c-c777-4c0b-951c-6ce2e03edd7e","Type":"ContainerDied","Data":"658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9"}
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270320 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="658bb29da666f3b8bb21f1b657a84dae77590d1b6fa2ef9da888d585f6571fb9"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.270417 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-79jc8"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.272511 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.272555 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.716869 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.801648 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") pod \"39991203-9b8d-4985-8e90-b3d1772f6b8f\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") "
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.801839 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") pod \"39991203-9b8d-4985-8e90-b3d1772f6b8f\" (UID: \"39991203-9b8d-4985-8e90-b3d1772f6b8f\") "
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.802194 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "39991203-9b8d-4985-8e90-b3d1772f6b8f" (UID: "39991203-9b8d-4985-8e90-b3d1772f6b8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.802342 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/39991203-9b8d-4985-8e90-b3d1772f6b8f-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.806882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7" (OuterVolumeSpecName: "kube-api-access-gzct7") pod "39991203-9b8d-4985-8e90-b3d1772f6b8f" (UID: "39991203-9b8d-4985-8e90-b3d1772f6b8f"). InnerVolumeSpecName "kube-api-access-gzct7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.847374 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.861241 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889"
Mar 20 13:51:27 crc kubenswrapper[4755]: I0320 13:51:27.903825 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzct7\" (UniqueName: \"kubernetes.io/projected/39991203-9b8d-4985-8e90-b3d1772f6b8f-kube-api-access-gzct7\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008137 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") pod \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") "
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008231 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") pod \"03accbff-bdf2-4256-bdf2-1b39d5485673\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") "
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008341 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") pod \"03accbff-bdf2-4256-bdf2-1b39d5485673\" (UID: \"03accbff-bdf2-4256-bdf2-1b39d5485673\") "
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008435 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") pod \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\" (UID: \"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86\") "
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008561 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" (UID: "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008761 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "03accbff-bdf2-4256-bdf2-1b39d5485673" (UID: "03accbff-bdf2-4256-bdf2-1b39d5485673"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.008831 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.012396 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6" (OuterVolumeSpecName: "kube-api-access-mhdb6") pod "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" (UID: "523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86"). InnerVolumeSpecName "kube-api-access-mhdb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.013670 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n" (OuterVolumeSpecName: "kube-api-access-bqb6n") pod "03accbff-bdf2-4256-bdf2-1b39d5485673" (UID: "03accbff-bdf2-4256-bdf2-1b39d5485673"). InnerVolumeSpecName "kube-api-access-bqb6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110301 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqb6n\" (UniqueName: \"kubernetes.io/projected/03accbff-bdf2-4256-bdf2-1b39d5485673-kube-api-access-bqb6n\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110342 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhdb6\" (UniqueName: \"kubernetes.io/projected/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86-kube-api-access-mhdb6\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.110352 4755 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/03accbff-bdf2-4256-bdf2-1b39d5485673-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282052 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-c99c-account-create-update-5s889" event={"ID":"03accbff-bdf2-4256-bdf2-1b39d5485673","Type":"ContainerDied","Data":"6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622"}
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282103 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fc43036a610a48be61d982e54a8adf1a8274d20fbcd3b174bdb2736ef76d622"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.282103 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-c99c-account-create-update-5s889"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283626 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf" event={"ID":"39991203-9b8d-4985-8e90-b3d1772f6b8f","Type":"ContainerDied","Data":"108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81"}
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283667 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-ee84-account-create-update-jpmvf"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.283672 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="108c1b83f128a08c09ef68778cad1ae2cb6fda2ce7f1afd9032d3273b9971f81"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285496 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285480 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-4e76-account-create-update-vjcr6" event={"ID":"523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86","Type":"ContainerDied","Data":"b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f"}
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.285621 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1827ad7ea66fc819f6e50f37833df0fe4c09c20fb51b9c4925edd1d874d847f"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.365877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.365924 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.406444 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:28 crc kubenswrapper[4755]: I0320 13:51:28.418686 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.226195 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301483 4755 generic.go:334] "Generic (PLEG): container finished" podID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5" exitCode=0
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301575 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301583 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.301626 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"}
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302535 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"93514dc3-0a66-4347-9dba-f787f875cd5c","Type":"ContainerDied","Data":"e82f1532e1a1c38107bb859460f4520da9baccf34fa7c549aea69d074c192f66"}
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.302555 4755 scope.go:117] "RemoveContainer" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.304001 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.304526 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.334129 4755 scope.go:117] "RemoveContainer" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.363685 4755 scope.go:117] "RemoveContainer" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.394849 4755 scope.go:117] "RemoveContainer" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.420048 4755 scope.go:117] "RemoveContainer" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"
Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.421091 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": container with ID starting with c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b not found: ID does not exist" containerID="c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421140 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b"} err="failed to get container status \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": rpc error: code = NotFound desc = could not find container \"c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b\": container with ID starting with c9890f968a06ee6b63be1511df5efa79e6acfc084e750ada054bb4a5cc75ec5b not found: ID does not exist"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421176 4755 scope.go:117] "RemoveContainer" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"
Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.421815 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": container with ID starting with d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e not found: ID does not exist" containerID="d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421882 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e"} err="failed to get container status \"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": rpc error: code = NotFound desc = could not find container \"d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e\": container with ID starting with d6d9e00c676428cd82dce5cb47bfc7b668166b46eb5559c8000c07f6bfbe895e not found: ID does not exist"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.421918 4755 scope.go:117] "RemoveContainer" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"
Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.422231 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": container with ID starting with 028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7 not found: ID does not exist" containerID="028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422255 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7"} err="failed to get container status \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": rpc error: code = NotFound desc = could not find container \"028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7\": container with ID starting with 028751d537c0f9cc5a9ba0f001c6bbde32be734f95a261d6869681bb4e8258c7 not found: ID does not exist"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422269 4755 scope.go:117] "RemoveContainer" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"
Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.422634 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": container with ID starting with 3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5 not found: ID does not exist" containerID="3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.422709 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5"} err="failed to get container status \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": rpc error: code = NotFound desc = could not find container \"3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5\": container with ID starting with 3ec3ed33d023ef764e7813435ed6124d4e8fb75e53858c95d83579442df203c5 not found: ID does not exist"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.436888 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437008 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437038 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437080 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437173 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437230 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437380 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") pod \"93514dc3-0a66-4347-9dba-f787f875cd5c\" (UID: \"93514dc3-0a66-4347-9dba-f787f875cd5c\") "
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437650 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.437932 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.438217 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.445461 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts" (OuterVolumeSpecName: "scripts") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.455761 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.458242 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9" (OuterVolumeSpecName: "kube-api-access-ctmh9") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "kube-api-access-ctmh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.469499 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.476562 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539299 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctmh9\" (UniqueName: \"kubernetes.io/projected/93514dc3-0a66-4347-9dba-f787f875cd5c-kube-api-access-ctmh9\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539329 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539338 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.539349 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/93514dc3-0a66-4347-9dba-f787f875cd5c-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.550537 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.611619 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data" (OuterVolumeSpecName: "config-data") pod "93514dc3-0a66-4347-9dba-f787f875cd5c" (UID: "93514dc3-0a66-4347-9dba-f787f875cd5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.640831 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.640870 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93514dc3-0a66-4347-9dba-f787f875cd5c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.934853 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.949112 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.978031 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.978448 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerName="mariadb-account-create-update"
Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.978464 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673"
containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983851 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983871 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983889 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983897 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983910 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983918 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983936 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983944 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983958 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983965 4755 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.983984 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.983993 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984005 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984012 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984028 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984036 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: E0320 13:51:29.984053 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984060 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984429 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f395acec-f28b-4622-b349-127cf31ec92d" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: 
I0320 13:51:29.984450 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984464 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984477 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984492 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-central-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984501 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="ceilometer-notification-agent" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" containerName="mariadb-account-create-update" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984529 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" containerName="mariadb-database-create" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984539 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="proxy-httpd" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.984554 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" containerName="sg-core" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.986508 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ceilometer-0"] Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.986640 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.989243 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:29 crc kubenswrapper[4755]: I0320 13:51:29.989762 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150746 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150791 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150819 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " 
pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.150943 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.151005 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.151028 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253508 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253554 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253597 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253681 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253720 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253776 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.253805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.254299 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 
crc kubenswrapper[4755]: I0320 13:51:30.255265 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.258553 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.258679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.259100 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.271846 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.275907 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"ceilometer-0\" (UID: 
\"69454aac-1cd3-4905-84a8-9798dce108a6\") " pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.306798 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:30 crc kubenswrapper[4755]: W0320 13:51:30.967458 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69454aac_1cd3_4905_84a8_9798dce108a6.slice/crio-bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf WatchSource:0}: Error finding container bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf: Status 404 returned error can't find the container with id bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf Mar 20 13:51:30 crc kubenswrapper[4755]: I0320 13:51:30.982145 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.237929 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93514dc3-0a66-4347-9dba-f787f875cd5c" path="/var/lib/kubelet/pods/93514dc3-0a66-4347-9dba-f787f875cd5c/volumes" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358024 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf"} Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358106 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.358119 4755 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 13:51:31.403476 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:31 crc kubenswrapper[4755]: I0320 
13:51:31.405898 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:51:32 crc kubenswrapper[4755]: I0320 13:51:32.366953 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"} Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.377700 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"} Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.917065 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.918733 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921484 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xvvcw" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921528 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.921846 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:51:33 crc kubenswrapper[4755]: I0320 13:51:33.929793 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038079 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038575 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " 
pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.038716 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139750 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139784 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139875 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.139935 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: 
\"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.146748 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.149070 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.152588 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.167548 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"nova-cell0-conductor-db-sync-mbd9g\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") " pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.360310 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.390711 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"} Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.909213 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 13:51:34 crc kubenswrapper[4755]: W0320 13:51:34.914105 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfaef786e_b221_4fff_8d48_42b8163ed86a.slice/crio-468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467 WatchSource:0}: Error finding container 468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467: Status 404 returned error can't find the container with id 468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467 Mar 20 13:51:34 crc kubenswrapper[4755]: I0320 13:51:34.940552 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:35 crc kubenswrapper[4755]: I0320 13:51:35.400892 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerStarted","Data":"468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467"} Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.434433 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerStarted","Data":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"} Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435186 4755 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent" containerID="cri-o://bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" gracePeriod=30
Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435548 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435862 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd" containerID="cri-o://89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" gracePeriod=30
Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435906 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core" containerID="cri-o://c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" gracePeriod=30
Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.435935 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent" containerID="cri-o://65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" gracePeriod=30
Mar 20 13:51:37 crc kubenswrapper[4755]: I0320 13:51:37.471247 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.2766984089999998 podStartE2EDuration="8.471224959s" podCreationTimestamp="2026-03-20 13:51:29 +0000 UTC" firstStartedPulling="2026-03-20 13:51:30.970731708 +0000 UTC m=+1270.568664237" lastFinishedPulling="2026-03-20 13:51:36.165258258 +0000 UTC m=+1275.763190787" observedRunningTime="2026-03-20 13:51:37.462120725 +0000 UTC m=+1277.060053254" watchObservedRunningTime="2026-03-20 13:51:37.471224959 +0000 UTC m=+1277.069157488"
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.446934 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378" exitCode=0
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447253 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d" exitCode=2
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447265 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb" exitCode=0
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447285 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"}
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447314 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"}
Mar 20 13:51:38 crc kubenswrapper[4755]: I0320 13:51:38.447323 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"}
Mar 20 13:51:44 crc kubenswrapper[4755]: I0320 13:51:44.505708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerStarted","Data":"109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28"}
Mar 20 13:51:44 crc kubenswrapper[4755]: I0320 13:51:44.531764 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" podStartSLOduration=2.555260545 podStartE2EDuration="11.531739763s" podCreationTimestamp="2026-03-20 13:51:33 +0000 UTC" firstStartedPulling="2026-03-20 13:51:34.916796229 +0000 UTC m=+1274.514728758" lastFinishedPulling="2026-03-20 13:51:43.893275457 +0000 UTC m=+1283.491207976" observedRunningTime="2026-03-20 13:51:44.527621198 +0000 UTC m=+1284.125553737" watchObservedRunningTime="2026-03-20 13:51:44.531739763 +0000 UTC m=+1284.129672312"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.165337 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275249 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275352 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275425 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275468 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275536 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275596 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275642 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") pod \"69454aac-1cd3-4905-84a8-9798dce108a6\" (UID: \"69454aac-1cd3-4905-84a8-9798dce108a6\") "
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.275841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.276413 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.276626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.282950 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k" (OuterVolumeSpecName: "kube-api-access-62f6k") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "kube-api-access-62f6k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.283745 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts" (OuterVolumeSpecName: "scripts") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.326298 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379084 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62f6k\" (UniqueName: \"kubernetes.io/projected/69454aac-1cd3-4905-84a8-9798dce108a6-kube-api-access-62f6k\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379128 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379140 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379151 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/69454aac-1cd3-4905-84a8-9798dce108a6-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.379433 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.414873 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data" (OuterVolumeSpecName: "config-data") pod "69454aac-1cd3-4905-84a8-9798dce108a6" (UID: "69454aac-1cd3-4905-84a8-9798dce108a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.481201 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.481245 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69454aac-1cd3-4905-84a8-9798dce108a6-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.518589 4755 generic.go:334] "Generic (PLEG): container finished" podID="69454aac-1cd3-4905-84a8-9798dce108a6" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4" exitCode=0
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.518749 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519744 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"}
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"69454aac-1cd3-4905-84a8-9798dce108a6","Type":"ContainerDied","Data":"bb827d2d89dbfd64eb2d92983f09677c473496c78324bc9f6094cc48e4355ddf"}
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.519949 4755 scope.go:117] "RemoveContainer" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.551012 4755 scope.go:117] "RemoveContainer" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.560622 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.575932 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.584752 4755 scope.go:117] "RemoveContainer" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.590847 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599077 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599109 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599121 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599127 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599143 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599149 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.599160 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599166 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599338 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-notification-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599346 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="sg-core"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599353 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="ceilometer-central-agent"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.599364 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" containerName="proxy-httpd"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.600939 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.617941 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.618341 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.618478 4755 scope.go:117] "RemoveContainer" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.639850 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.656390 4755 scope.go:117] "RemoveContainer" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.656968 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": container with ID starting with 89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378 not found: ID does not exist" containerID="89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.656996 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378"} err="failed to get container status \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": rpc error: code = NotFound desc = could not find container \"89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378\": container with ID starting with 89571582f60d391b36b2a039b7395ef411c1dac4153779bd96a8ce9e6950b378 not found: ID does not exist"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657017 4755 scope.go:117] "RemoveContainer" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.657616 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": container with ID starting with c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d not found: ID does not exist" containerID="c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657633 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d"} err="failed to get container status \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": rpc error: code = NotFound desc = could not find container \"c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d\": container with ID starting with c34ba2ec41165163490b788f50278831017492548bb714685b760814708fca0d not found: ID does not exist"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657645 4755 scope.go:117] "RemoveContainer" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.657815 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": container with ID starting with 65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb not found: ID does not exist" containerID="65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657828 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb"} err="failed to get container status \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": rpc error: code = NotFound desc = could not find container \"65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb\": container with ID starting with 65919bbfd3f574974f67fb6bd12e3a3c6a8cf3e244bdbdaa293cb99ca8d4aedb not found: ID does not exist"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.657839 4755 scope.go:117] "RemoveContainer" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"
Mar 20 13:51:45 crc kubenswrapper[4755]: E0320 13:51:45.660920 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": container with ID starting with bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4 not found: ID does not exist" containerID="bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.660957 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4"} err="failed to get container status \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": rpc error: code = NotFound desc = could not find container \"bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4\": container with ID starting with bb8c695cb776e00fc2569d1ef84ab7a185015a375fca7e3a17d95c96015b65c4 not found: ID does not exist"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.688877 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689319 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689470 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689585 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689711 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.689847 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791605 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791672 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791708 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791763 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791810 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791840 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.791867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.792214 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.792282 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.795328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.795720 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.796007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.796596 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.815488 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"ceilometer-0\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") " pod="openstack/ceilometer-0"
Mar 20 13:51:45 crc kubenswrapper[4755]: I0320 13:51:45.936691 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:46 crc kubenswrapper[4755]: I0320 13:51:46.402044 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:46 crc kubenswrapper[4755]: I0320 13:51:46.529664 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"c25884d58a94c709373c3a5fa0d84cf0c5b160aab2b5f055314dff5c1917244c"}
Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.236811 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69454aac-1cd3-4905-84a8-9798dce108a6" path="/var/lib/kubelet/pods/69454aac-1cd3-4905-84a8-9798dce108a6/volumes"
Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.376477 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:47 crc kubenswrapper[4755]: I0320 13:51:47.543256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"}
Mar 20 13:51:48 crc kubenswrapper[4755]: I0320 13:51:48.553026 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"}
Mar 20 13:51:49 crc kubenswrapper[4755]: I0320 13:51:49.563909 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"}
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerStarted","Data":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"}
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601949 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" containerID="cri-o://7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" gracePeriod=30
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602020 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" containerID="cri-o://3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" gracePeriod=30
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602198 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" containerID="cri-o://d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" gracePeriod=30
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.602379 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.601960 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" containerID="cri-o://3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" gracePeriod=30
Mar 20 13:51:52 crc kubenswrapper[4755]: I0320 13:51:52.654864 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.589498097 podStartE2EDuration="7.654844766s" podCreationTimestamp="2026-03-20 13:51:45 +0000 UTC" firstStartedPulling="2026-03-20 13:51:46.403155589 +0000 UTC m=+1286.001088138" lastFinishedPulling="2026-03-20 13:51:51.468502268 +0000 UTC m=+1291.066434807" observedRunningTime="2026-03-20 13:51:52.641347158 +0000 UTC m=+1292.239279707" watchObservedRunningTime="2026-03-20 13:51:52.654844766 +0000 UTC m=+1292.252777285"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.347679 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.449880 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.449932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450044 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450082 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450210 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450281 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") pod \"1221c1db-7d43-4307-928b-1360577fcbe7\" (UID: \"1221c1db-7d43-4307-928b-1360577fcbe7\") "
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450813 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.450955 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.456412 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt" (OuterVolumeSpecName: "kube-api-access-ssgtt") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "kube-api-access-ssgtt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.456490 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts" (OuterVolumeSpecName: "scripts") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.487083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.523218 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "combined-ca-bundle".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.547118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data" (OuterVolumeSpecName: "config-data") pod "1221c1db-7d43-4307-928b-1360577fcbe7" (UID: "1221c1db-7d43-4307-928b-1360577fcbe7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552479 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552511 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552528 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552539 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552552 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1221c1db-7d43-4307-928b-1360577fcbe7-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552562 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/1221c1db-7d43-4307-928b-1360577fcbe7-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.552575 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssgtt\" (UniqueName: \"kubernetes.io/projected/1221c1db-7d43-4307-928b-1360577fcbe7-kube-api-access-ssgtt\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616851 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616897 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" exitCode=2 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616907 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616914 4755 generic.go:334] "Generic (PLEG): container finished" podID="1221c1db-7d43-4307-928b-1360577fcbe7" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" exitCode=0 Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616906 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616958 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.616989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617002 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1221c1db-7d43-4307-928b-1360577fcbe7","Type":"ContainerDied","Data":"c25884d58a94c709373c3a5fa0d84cf0c5b160aab2b5f055314dff5c1917244c"} Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617006 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.617024 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.637451 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.681699 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.690936 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.706868 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.717351 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.725787 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726493 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.726598 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726734 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.726808 4755 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.726907 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727008 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.727099 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727164 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727440 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-central-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727544 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="sg-core" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727615 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="proxy-httpd" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.727718 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" containerName="ceilometer-notification-agent" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.731026 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.738613 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.739553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.739553 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.809748 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.810361 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.810428 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.810471 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 
13:51:53.811028 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811063 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811086 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.811438 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811475 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc 
error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811493 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: E0320 13:51:53.811804 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811839 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.811866 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812221 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status 
\"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812244 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812575 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812596 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812842 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.812860 4755 scope.go:117] "RemoveContainer" 
containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813053 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813097 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813317 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813335 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813566 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could 
not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813591 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813850 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.813874 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814311 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814333 4755 scope.go:117] "RemoveContainer" containerID="7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 
13:51:53.814563 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480"} err="failed to get container status \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": rpc error: code = NotFound desc = could not find container \"7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480\": container with ID starting with 7019c9f8d845fd2860cdf254c1e65af747d2630f4b1ca018664f0a0aa0868480 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814582 4755 scope.go:117] "RemoveContainer" containerID="3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814792 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212"} err="failed to get container status \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": rpc error: code = NotFound desc = could not find container \"3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212\": container with ID starting with 3810d4c4d3405a958a8cec24b67f603064b42fac7624cceb3a552b0307692212 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.814814 4755 scope.go:117] "RemoveContainer" containerID="d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815006 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5"} err="failed to get container status \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": rpc error: code = NotFound desc = could not find container \"d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5\": container with ID starting with 
d2fe1aa5fbb9e06c49340b7cbac828af46cc02d5d11c832afc227851262b97e5 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815024 4755 scope.go:117] "RemoveContainer" containerID="3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.815265 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7"} err="failed to get container status \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": rpc error: code = NotFound desc = could not find container \"3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7\": container with ID starting with 3152a6579563913d3110161e522bacc842b34e90ba06497ca5abfbca867a14b7 not found: ID does not exist" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857640 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857733 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857754 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0" Mar 20 13:51:53 crc 
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857814 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857864 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.857934 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959690 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959733 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959791 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959818 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.959843 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.960387 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.960626 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.965645 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.966572 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.967173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.967361 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:53 crc kubenswrapper[4755]: I0320 13:51:53.980801 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"ceilometer-0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " pod="openstack/ceilometer-0"
Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.112553 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:51:54 crc kubenswrapper[4755]: W0320 13:51:54.605566 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod886eb096_8aa3_423b_b611_03cc592de1d0.slice/crio-05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29 WatchSource:0}: Error finding container 05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29: Status 404 returned error can't find the container with id 05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29
Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.607139 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.631222 4755 generic.go:334] "Generic (PLEG): container finished" podID="faef786e-b221-4fff-8d48-42b8163ed86a" containerID="109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28" exitCode=0
Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.631293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerDied","Data":"109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28"}
Mar 20 13:51:54 crc kubenswrapper[4755]: I0320 13:51:54.634385 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29"}
Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.243418 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1221c1db-7d43-4307-928b-1360577fcbe7" path="/var/lib/kubelet/pods/1221c1db-7d43-4307-928b-1360577fcbe7/volumes"
Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.645911 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"}
Mar 20 13:51:55 crc kubenswrapper[4755]: I0320 13:51:55.994478 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108067 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") "
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108204 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") "
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108266 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") "
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.108318 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") pod \"faef786e-b221-4fff-8d48-42b8163ed86a\" (UID: \"faef786e-b221-4fff-8d48-42b8163ed86a\") "
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.113863 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts" (OuterVolumeSpecName: "scripts") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.116113 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj" (OuterVolumeSpecName: "kube-api-access-wf2pj") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "kube-api-access-wf2pj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.139880 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.178917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data" (OuterVolumeSpecName: "config-data") pod "faef786e-b221-4fff-8d48-42b8163ed86a" (UID: "faef786e-b221-4fff-8d48-42b8163ed86a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211757 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211815 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211831 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faef786e-b221-4fff-8d48-42b8163ed86a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.211848 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf2pj\" (UniqueName: \"kubernetes.io/projected/faef786e-b221-4fff-8d48-42b8163ed86a-kube-api-access-wf2pj\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.658568 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-mbd9g" event={"ID":"faef786e-b221-4fff-8d48-42b8163ed86a","Type":"ContainerDied","Data":"468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467"}
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.660118 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="468656ee4a41e081bb83f9b6787c1cd626ff1c56d93535dd53334282f9518467"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.660171 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-mbd9g"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.665447 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"}
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.745944 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 13:51:56 crc kubenswrapper[4755]: E0320 13:51:56.746637 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.746780 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.747149 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" containerName="nova-cell0-conductor-db-sync"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.747948 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.750418 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.752359 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-xvvcw"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.760951 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825407 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825459 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.825515 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927601 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927644 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.927707 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.932868 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.933193 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/676b01c6-a64d-4530-b157-10160afd719a-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:56 crc kubenswrapper[4755]: I0320 13:51:56.947556 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqr2c\" (UniqueName: \"kubernetes.io/projected/676b01c6-a64d-4530-b157-10160afd719a-kube-api-access-jqr2c\") pod \"nova-cell0-conductor-0\" (UID: \"676b01c6-a64d-4530-b157-10160afd719a\") " pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.066923 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.539293 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 20 13:51:57 crc kubenswrapper[4755]: W0320 13:51:57.540382 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod676b01c6_a64d_4530_b157_10160afd719a.slice/crio-622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66 WatchSource:0}: Error finding container 622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66: Status 404 returned error can't find the container with id 622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66
Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.680154 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"}
Mar 20 13:51:57 crc kubenswrapper[4755]: I0320 13:51:57.681789 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"676b01c6-a64d-4530-b157-10160afd719a","Type":"ContainerStarted","Data":"622ab07a55ba152314421e8fafcb3987628df52f5f2e5d645912f8d32838cd66"}
Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.706371 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"676b01c6-a64d-4530-b157-10160afd719a","Type":"ContainerStarted","Data":"6a245ce1a50d214ae90c1d6f845a8ed04a969d3c3195667e4091a43718f806b1"}
Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.708937 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 20 13:51:58 crc kubenswrapper[4755]: I0320 13:51:58.736292 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.736273026 podStartE2EDuration="2.736273026s" podCreationTimestamp="2026-03-20 13:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:58.72372272 +0000 UTC m=+1298.321655289" watchObservedRunningTime="2026-03-20 13:51:58.736273026 +0000 UTC m=+1298.334205555"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.133438 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"]
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.135026 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.139615 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.139937 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.140042 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.144710 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"]
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.203998 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.306013 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.326839 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"auto-csr-approver-29566912-cl4dk\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " pod="openshift-infra/auto-csr-approver-29566912-cl4dk"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.453951 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.726575 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerStarted","Data":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"}
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.727168 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.771340 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.957171275 podStartE2EDuration="7.77131784s" podCreationTimestamp="2026-03-20 13:51:53 +0000 UTC" firstStartedPulling="2026-03-20 13:51:54.610406502 +0000 UTC m=+1294.208339071" lastFinishedPulling="2026-03-20 13:51:59.424553087 +0000 UTC m=+1299.022485636" observedRunningTime="2026-03-20 13:52:00.768098917 +0000 UTC m=+1300.366031446" watchObservedRunningTime="2026-03-20 13:52:00.77131784 +0000 UTC m=+1300.369250369"
Mar 20 13:52:00 crc kubenswrapper[4755]: I0320 13:52:00.960742 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"]
Mar 20 13:52:01 crc kubenswrapper[4755]: I0320 13:52:01.737495 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerStarted","Data":"b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f"}
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.117297 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.577494 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.578583 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.582337 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.591985 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.607171 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.665882 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666226 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666272 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.666305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.757207 4755 generic.go:334] "Generic (PLEG): container finished" podID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerID="72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608" exitCode=0
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.757274 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerDied","Data":"72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608"}
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.769996 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770070 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770112 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.770145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.785336 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.787365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.798294 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.803574 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"nova-cell0-cell-mapping-vz8fw\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.837819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.842779 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.857266 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.860436 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.902869 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.952519 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.954170 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.955507 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.958725 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.981813 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.991017 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996346 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996764 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996854 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:02 crc kubenswrapper[4755]: I0320 13:52:02.996926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.036863 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.107595 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108031 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108056 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108098 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108118 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108175 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108201 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108229 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.108256 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.122511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.137039 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.138682 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.139930 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.143028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.144755 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") " pod="openstack/nova-scheduler-0"
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.161554 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.206100 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211061 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211143 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211177 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211227 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211254 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211286 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211365 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211400 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.211460 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: 
\"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.216489 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.230095 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.246365 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.253131 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.271722 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc 
kubenswrapper[4755]: I0320 13:52:03.273594 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.274560 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") " pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.298126 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.301348 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.301470 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314159 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314865 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.314927 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.319338 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.323290 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.326950 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.342474 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"nova-metadata-0\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.381632 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.411104 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.425914 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.425963 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426094 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426141 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426169 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod 
\"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.426305 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545309 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545361 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545397 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: 
\"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545602 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.545619 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.547173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.548105 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.549113 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 
20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.550007 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.550812 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.598673 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"dnsmasq-dns-bccf8f775-726q6\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") " pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.607638 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.633544 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:03 crc kubenswrapper[4755]: W0320 13:52:03.699048 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ff73477_b65b_4362_938c_94b1bb1f51b0.slice/crio-d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73 WatchSource:0}: Error finding container d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73: Status 404 returned error can't find the container with id d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73 Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.702957 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.808057 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerStarted","Data":"d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73"} Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.910350 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.914331 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.917503 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.921121 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.924630 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:52:03 crc kubenswrapper[4755]: I0320 13:52:03.930955 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070317 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070841 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070872 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " 
pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.070926 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.075190 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172728 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172781 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.172867 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.186974 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.187972 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.189215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.196215 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.210351 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"nova-cell1-conductor-db-sync-qbtvj\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 
13:52:04.320783 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.326488 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.411775 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.417446 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"] Mar 20 13:52:04 crc kubenswrapper[4755]: W0320 13:52:04.453737 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24dab838_4670_45f3_8276_240f4266194d.slice/crio-a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9 WatchSource:0}: Error finding container a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9: Status 404 returned error can't find the container with id a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.587517 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") pod \"78cf774b-eb80-4f5b-a7de-2012636d36c5\" (UID: \"78cf774b-eb80-4f5b-a7de-2012636d36c5\") " Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.594807 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz" (OuterVolumeSpecName: "kube-api-access-jx4bz") pod "78cf774b-eb80-4f5b-a7de-2012636d36c5" (UID: "78cf774b-eb80-4f5b-a7de-2012636d36c5"). InnerVolumeSpecName "kube-api-access-jx4bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.690171 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx4bz\" (UniqueName: \"kubernetes.io/projected/78cf774b-eb80-4f5b-a7de-2012636d36c5-kube-api-access-jx4bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.879023 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerStarted","Data":"4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.891675 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerStarted","Data":"07fce25f45c6fe707885052798e47cbce52b19aa7717044f52d0bb81703a15ba"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.912631 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vz8fw" podStartSLOduration=2.912597094 podStartE2EDuration="2.912597094s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:04.899047113 +0000 UTC m=+1304.496979642" watchObservedRunningTime="2026-03-20 13:52:04.912597094 +0000 UTC m=+1304.510529623" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.912942 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.920834 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerStarted","Data":"ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930185 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930196 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-cl4dk" event={"ID":"78cf774b-eb80-4f5b-a7de-2012636d36c5","Type":"ContainerDied","Data":"b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.930243 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07077c65d7a8ce69c944552b9a779349ccdc8a2914370a0af3c251b613f1c6f" Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.931861 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.936988 4755 generic.go:334] "Generic (PLEG): container finished" podID="24dab838-4670-45f3-8276-240f4266194d" containerID="1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16" exitCode=0 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.937175 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16"} Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.937258 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerStarted","Data":"a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9"} Mar 20 13:52:04 crc kubenswrapper[4755]: W0320 
13:52:04.939227 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcadbdc7c_ed66_43d7_82ee_d797beb959a8.slice/crio-01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636 WatchSource:0}: Error finding container 01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636: Status 404 returned error can't find the container with id 01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636 Mar 20 13:52:04 crc kubenswrapper[4755]: I0320 13:52:04.955894 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"d305ab31e9622ed372defac60b08de6826298aff9ba6d4afc85a9d25d074e86a"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.500209 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.522208 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-gwzkg"] Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.981041 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerStarted","Data":"8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.981292 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerStarted","Data":"01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"} Mar 20 13:52:05 crc kubenswrapper[4755]: I0320 13:52:05.990748 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" 
event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerStarted","Data":"62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d"} Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.010515 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" podStartSLOduration=3.010486787 podStartE2EDuration="3.010486787s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:06.003302282 +0000 UTC m=+1305.601234831" watchObservedRunningTime="2026-03-20 13:52:06.010486787 +0000 UTC m=+1305.608419326" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.035108 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-726q6" podStartSLOduration=3.035085355 podStartE2EDuration="3.035085355s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:06.030109896 +0000 UTC m=+1305.628042455" watchObservedRunningTime="2026-03-20 13:52:06.035085355 +0000 UTC m=+1305.633017884" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.651277 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.661313 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.751927 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:52:06 crc 
kubenswrapper[4755]: I0320 13:52:06.752018 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:06 crc kubenswrapper[4755]: I0320 13:52:06.999361 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:07 crc kubenswrapper[4755]: I0320 13:52:07.256057 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47f9ab28-1218-4dcf-a989-728b9063a3e9" path="/var/lib/kubelet/pods/47f9ab28-1218-4dcf-a989-728b9063a3e9/volumes" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.007526 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.008005 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerStarted","Data":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.009898 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerStarted","Data":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.018213 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} 
Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.021020 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" gracePeriod=30 Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.021325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerStarted","Data":"56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0"} Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.043363 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.783444717 podStartE2EDuration="6.043333235s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.090688149 +0000 UTC m=+1303.688620678" lastFinishedPulling="2026-03-20 13:52:07.350576667 +0000 UTC m=+1306.948509196" observedRunningTime="2026-03-20 13:52:08.032375361 +0000 UTC m=+1307.630307890" watchObservedRunningTime="2026-03-20 13:52:08.043333235 +0000 UTC m=+1307.641265764" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.129336 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.968323458 podStartE2EDuration="6.129310843s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.188561165 +0000 UTC m=+1303.786493694" lastFinishedPulling="2026-03-20 13:52:07.34954855 +0000 UTC m=+1306.947481079" observedRunningTime="2026-03-20 13:52:08.095105447 +0000 UTC m=+1307.693037976" watchObservedRunningTime="2026-03-20 13:52:08.129310843 +0000 UTC m=+1307.727243372" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.131140 4755 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.71486039 podStartE2EDuration="6.131133559s" podCreationTimestamp="2026-03-20 13:52:02 +0000 UTC" firstStartedPulling="2026-03-20 13:52:03.929332109 +0000 UTC m=+1303.527264638" lastFinishedPulling="2026-03-20 13:52:07.345605278 +0000 UTC m=+1306.943537807" observedRunningTime="2026-03-20 13:52:08.125154205 +0000 UTC m=+1307.723086734" watchObservedRunningTime="2026-03-20 13:52:08.131133559 +0000 UTC m=+1307.729066088" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.207146 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:52:08 crc kubenswrapper[4755]: I0320 13:52:08.411788 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.034589 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerStarted","Data":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.035014 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" containerID="cri-o://f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" gracePeriod=30 Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.035022 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" containerID="cri-o://9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" gracePeriod=30 Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.076709 4755 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.058379867 podStartE2EDuration="6.076646666s" podCreationTimestamp="2026-03-20 13:52:03 +0000 UTC" firstStartedPulling="2026-03-20 13:52:04.332221216 +0000 UTC m=+1303.930153745" lastFinishedPulling="2026-03-20 13:52:07.350488015 +0000 UTC m=+1306.948420544" observedRunningTime="2026-03-20 13:52:09.065928188 +0000 UTC m=+1308.663860717" watchObservedRunningTime="2026-03-20 13:52:09.076646666 +0000 UTC m=+1308.674579195" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.723160 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832427 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832533 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832644 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.832855 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") 
pod \"d4601b20-9dc6-41dd-ab44-f10600003906\" (UID: \"d4601b20-9dc6-41dd-ab44-f10600003906\") " Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.833639 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs" (OuterVolumeSpecName: "logs") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.854053 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v" (OuterVolumeSpecName: "kube-api-access-5vf8v") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "kube-api-access-5vf8v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.882067 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.883394 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data" (OuterVolumeSpecName: "config-data") pod "d4601b20-9dc6-41dd-ab44-f10600003906" (UID: "d4601b20-9dc6-41dd-ab44-f10600003906"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936002 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4601b20-9dc6-41dd-ab44-f10600003906-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936036 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936047 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4601b20-9dc6-41dd-ab44-f10600003906-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4755]: I0320 13:52:09.936057 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vf8v\" (UniqueName: \"kubernetes.io/projected/d4601b20-9dc6-41dd-ab44-f10600003906-kube-api-access-5vf8v\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069371 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4601b20-9dc6-41dd-ab44-f10600003906" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" exitCode=0 Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069471 4755 generic.go:334] "Generic (PLEG): container finished" podID="d4601b20-9dc6-41dd-ab44-f10600003906" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" exitCode=143 Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069629 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069816 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d4601b20-9dc6-41dd-ab44-f10600003906","Type":"ContainerDied","Data":"1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753"} Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.069840 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.094885 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.123911 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.124738 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc kubenswrapper[4755]: 
I0320 13:52:10.124865 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} err="failed to get container status \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.124921 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.125362 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.125413 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} err="failed to get container status \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.125446 4755 scope.go:117] "RemoveContainer" containerID="9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66" Mar 20 13:52:10 crc 
kubenswrapper[4755]: I0320 13:52:10.126190 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66"} err="failed to get container status \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": rpc error: code = NotFound desc = could not find container \"9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66\": container with ID starting with 9295f01c8f5f0ed05d04af20d2318a7c69f831272f3ca90eb8fc5af764a6bc66 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.126255 4755 scope.go:117] "RemoveContainer" containerID="f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.126642 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28"} err="failed to get container status \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": rpc error: code = NotFound desc = could not find container \"f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28\": container with ID starting with f6280e6b10010d9afeb3b0552aba6945ebb7e4c8d1c111c2b5e59c036d5eab28 not found: ID does not exist" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.142232 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.148943 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.164561 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.164982 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" 
containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.164998 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.165018 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165025 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.165033 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165039 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165213 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-log" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165223 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" containerName="oc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.165237 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" containerName="nova-metadata-metadata" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.166212 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179307 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179512 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.179793 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.246567 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.246765 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247062 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247157 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod 
\"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.247726 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: E0320 13:52:10.331030 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4601b20_9dc6_41dd_ab44_f10600003906.slice/crio-1a97c2ff93adc00e2962e3ca60c754b0172a77338886546dec9012549fe43753\": RecentStats: unable to find data in memory cache]" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349506 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349561 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.349642 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 
crc kubenswrapper[4755]: I0320 13:52:10.349787 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.350236 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.350420 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.354328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.354521 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.369018 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.380223 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"nova-metadata-0\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " pod="openstack/nova-metadata-0" Mar 20 13:52:10 crc kubenswrapper[4755]: I0320 13:52:10.509090 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.051328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.090542 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"7a21afc95b0624df6ce1d9bc13f6bf0f3fd81506690ec7d242284e3e4ee61373"} Mar 20 13:52:11 crc kubenswrapper[4755]: I0320 13:52:11.239788 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4601b20-9dc6-41dd-ab44-f10600003906" path="/var/lib/kubelet/pods/d4601b20-9dc6-41dd-ab44-f10600003906/volumes" Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.103153 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.103495 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerStarted","Data":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} Mar 20 13:52:12 crc kubenswrapper[4755]: I0320 13:52:12.132499 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.132472547 podStartE2EDuration="2.132472547s" podCreationTimestamp="2026-03-20 13:52:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:12.124416988 +0000 UTC m=+1311.722349537" watchObservedRunningTime="2026-03-20 13:52:12.132472547 +0000 UTC m=+1311.730405096" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.115939 4755 generic.go:334] "Generic (PLEG): container finished" podID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerID="4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a" exitCode=0 Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.116048 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerDied","Data":"4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a"} Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.206942 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.246303 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.382588 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.382634 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 
13:52:13.635698 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-726q6" Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.698746 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:52:13 crc kubenswrapper[4755]: I0320 13:52:13.698979 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-656mk" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns" containerID="cri-o://fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437" gracePeriod=10 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.130407 4755 generic.go:334] "Generic (PLEG): container finished" podID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerID="8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718" exitCode=0 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.130576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerDied","Data":"8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718"} Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.137121 4755 generic.go:334] "Generic (PLEG): container finished" podID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerID="fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437" exitCode=0 Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.138031 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437"} Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.177164 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:52:14 crc 
kubenswrapper[4755]: I0320 13:52:14.288906 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.340610 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341088 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341336 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341383 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.341464 
4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") pod \"f4e36ff1-5396-4e15-ad2f-6312bc653076\" (UID: \"f4e36ff1-5396-4e15-ad2f-6312bc653076\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.358129 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6" (OuterVolumeSpecName: "kube-api-access-czrk6") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "kube-api-access-czrk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.444306 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czrk6\" (UniqueName: \"kubernetes.io/projected/f4e36ff1-5396-4e15-ad2f-6312bc653076-kube-api-access-czrk6\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.444543 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.446186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.450441 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.465855 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.465887 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.193:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.488399 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config" (OuterVolumeSpecName: "config") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.493617 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.497694 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4e36ff1-5396-4e15-ad2f-6312bc653076" (UID: "f4e36ff1-5396-4e15-ad2f-6312bc653076"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545339 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545395 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545447 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.545600 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") pod \"2ff73477-b65b-4362-938c-94b1bb1f51b0\" (UID: \"2ff73477-b65b-4362-938c-94b1bb1f51b0\") " Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546030 4755 reconciler_common.go:293] 
"Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546056 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546071 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546084 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.546096 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4e36ff1-5396-4e15-ad2f-6312bc653076-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.548992 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts" (OuterVolumeSpecName: "scripts") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.551370 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc" (OuterVolumeSpecName: "kube-api-access-jz6sc") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "kube-api-access-jz6sc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.576213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.584971 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data" (OuterVolumeSpecName: "config-data") pod "2ff73477-b65b-4362-938c-94b1bb1f51b0" (UID: "2ff73477-b65b-4362-938c-94b1bb1f51b0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648269 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648305 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648317 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz6sc\" (UniqueName: \"kubernetes.io/projected/2ff73477-b65b-4362-938c-94b1bb1f51b0-kube-api-access-jz6sc\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:14 crc kubenswrapper[4755]: I0320 13:52:14.648326 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ff73477-b65b-4362-938c-94b1bb1f51b0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157362 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-656mk" event={"ID":"f4e36ff1-5396-4e15-ad2f-6312bc653076","Type":"ContainerDied","Data":"0ac451aa4ed8d677d77946a6a4c4490aa16c5aad1720a8d22a9ecfc0acddbe6e"} Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157427 4755 scope.go:117] "RemoveContainer" containerID="fc9fdaf0d4ddb2a2717a4ecdef08718379b659c8a91930a404bc7d0be2c15437" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.157606 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-656mk" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.176826 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vz8fw" event={"ID":"2ff73477-b65b-4362-938c-94b1bb1f51b0","Type":"ContainerDied","Data":"d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73"} Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.177346 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d32aa5deb6f67b7a188b3ac36319ad6c1939a9c26f528a46388853fbd890db73" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.176960 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vz8fw" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.214482 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.226030 4755 scope.go:117] "RemoveContainer" containerID="74077c1fbfd38abf1631b5d13dad67c803ceb01aef73d54463a17c47b433408b" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.235747 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-656mk"] Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357018 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357237 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" containerID="cri-o://51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" gracePeriod=30 Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.357786 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9039b999-a68c-4920-af85-ac61d8509b06" 
containerName="nova-api-api" containerID="cri-o://a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" gracePeriod=30 Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.398204 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.445022 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.445466 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log" containerID="cri-o://786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158" gracePeriod=30 Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.446138 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata" containerID="cri-o://b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb" gracePeriod=30 Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.555029 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.679908 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680000 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680148 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.680246 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") pod \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\" (UID: \"cadbdc7c-ed66-43d7-82ee-d797beb959a8\") " Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.687455 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p" (OuterVolumeSpecName: "kube-api-access-ccz2p") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "kube-api-access-ccz2p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.697844 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts" (OuterVolumeSpecName: "scripts") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.714768 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.718841 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data" (OuterVolumeSpecName: "config-data") pod "cadbdc7c-ed66-43d7-82ee-d797beb959a8" (UID: "cadbdc7c-ed66-43d7-82ee-d797beb959a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782812 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782889 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782903 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ccz2p\" (UniqueName: \"kubernetes.io/projected/cadbdc7c-ed66-43d7-82ee-d797beb959a8-kube-api-access-ccz2p\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:15 crc kubenswrapper[4755]: I0320 13:52:15.782918 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cadbdc7c-ed66-43d7-82ee-d797beb959a8-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.033624 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088242 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088351 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088406 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088488 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.088603 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") pod \"56052c23-c9d5-4eba-9696-13d244f6cf97\" (UID: \"56052c23-c9d5-4eba-9696-13d244f6cf97\") " Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.089403 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs" (OuterVolumeSpecName: "logs") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.089913 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56052c23-c9d5-4eba-9696-13d244f6cf97-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.095598 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv" (OuterVolumeSpecName: "kube-api-access-2f9jv") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "kube-api-access-2f9jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.126050 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data" (OuterVolumeSpecName: "config-data") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.133093 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.148568 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "56052c23-c9d5-4eba-9696-13d244f6cf97" (UID: "56052c23-c9d5-4eba-9696-13d244f6cf97"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194774 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194876 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194893 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2f9jv\" (UniqueName: \"kubernetes.io/projected/56052c23-c9d5-4eba-9696-13d244f6cf97-kube-api-access-2f9jv\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.194936 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/56052c23-c9d5-4eba-9696-13d244f6cf97-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.197380 4755 generic.go:334] "Generic (PLEG): container finished" podID="9039b999-a68c-4920-af85-ac61d8509b06" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831" exitCode=143 Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.197398 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"} Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201889 4755 generic.go:334] "Generic (PLEG): container finished" podID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb" exitCode=0 Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201911 4755 generic.go:334] "Generic (PLEG): container finished" podID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158" exitCode=143 Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201949 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201957 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.201994 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56052c23-c9d5-4eba-9696-13d244f6cf97","Type":"ContainerDied","Data":"7a21afc95b0624df6ce1d9bc13f6bf0f3fd81506690ec7d242284e3e4ee61373"} Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.202008 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb" Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 
13:52:16.204304 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbtvj" event={"ID":"cadbdc7c-ed66-43d7-82ee-d797beb959a8","Type":"ContainerDied","Data":"01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"}
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204341 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01be3a51f70e0f73a891f6548879c583c460ee7b7f4aaa596942e2ffe8ef5636"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204348 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbtvj"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.204420 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" containerID="cri-o://e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" gracePeriod=30
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.237057 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245160 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245720 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245744 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245760 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245768 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245786 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245793 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245817 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245836 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245852 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="init"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245860 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="init"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.245870 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.245878 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246076 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" containerName="nova-manage"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246098 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-metadata"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246109 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" containerName="nova-metadata-log"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246135 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246155 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" containerName="dnsmasq-dns"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.246896 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.250225 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.273091 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.283622 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.287627 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.288152 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.288191 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} err="failed to get container status \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.288219 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: E0320 13:52:16.293764 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not exist" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293813 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} err="failed to get container status \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293847 4755 scope.go:117] "RemoveContainer" containerID="b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.293977 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294391 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb"} err="failed to get container status \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": rpc error: code = NotFound desc = could not find container \"b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb\": container with ID starting with b5a04ba75f7cdc9eae7cea50cbd360f66b23792154a3c20bd2a7ceeff1ad7dbb not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294434 4755 scope.go:117] "RemoveContainer" containerID="786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.294917 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158"} err="failed to get container status \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": rpc error: code = NotFound desc = could not find container \"786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158\": container with ID starting with 786f11c094e4c2faf90093669785a969b8040479c6e6fa12888b7f8802509158 not found: ID does not exist"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296070 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296119 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.296195 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.321793 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.323622 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.326081 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.326406 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.336864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398533 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398624 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398708 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398793 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398913 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.398951 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.402785 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.402862 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.415511 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnjr5\" (UniqueName: \"kubernetes.io/projected/32aa4c4f-3c67-46f5-90ae-59d17077eb1d-kube-api-access-hnjr5\") pod \"nova-cell1-conductor-0\" (UID: \"32aa4c4f-3c67-46f5-90ae-59d17077eb1d\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500553 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500623 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500683 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500718 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.500747 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.501278 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.505127 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.505272 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.506173 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.525467 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"nova-metadata-0\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " pod="openstack/nova-metadata-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.578616 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:16 crc kubenswrapper[4755]: I0320 13:52:16.641240 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.061102 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:52:17 crc kubenswrapper[4755]: W0320 13:52:17.067866 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32aa4c4f_3c67_46f5_90ae_59d17077eb1d.slice/crio-08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f WatchSource:0}: Error finding container 08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f: Status 404 returned error can't find the container with id 08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f
Mar 20 13:52:17 crc kubenswrapper[4755]: W0320 13:52:17.179573 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc55c1b8_6ed7_41ba_b5a6_8fe3f03fe3c7.slice/crio-b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab WatchSource:0}: Error finding container b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab: Status 404 returned error can't find the container with id b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.181189 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.216768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab"}
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.218704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"32aa4c4f-3c67-46f5-90ae-59d17077eb1d","Type":"ContainerStarted","Data":"08018fc24a18c0983f5b7c8b9af70a8b09b11f3174f225011299e4bc656c8c5f"}
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.246437 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56052c23-c9d5-4eba-9696-13d244f6cf97" path="/var/lib/kubelet/pods/56052c23-c9d5-4eba-9696-13d244f6cf97/volumes"
Mar 20 13:52:17 crc kubenswrapper[4755]: I0320 13:52:17.248086 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4e36ff1-5396-4e15-ad2f-6312bc653076" path="/var/lib/kubelet/pods/f4e36ff1-5396-4e15-ad2f-6312bc653076/volumes"
Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.210137 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.212167 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.213567 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 20 13:52:18 crc kubenswrapper[4755]: E0320 13:52:18.213617 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler"
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.232427 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"32aa4c4f-3c67-46f5-90ae-59d17077eb1d","Type":"ContainerStarted","Data":"561e9d4a927852e54a364b78bdf7d50a740fd277b73deb5ad2d594176e1f9238"}
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.232637 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.239506 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b"}
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.240063 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerStarted","Data":"b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7"}
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.262829 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.262805423 podStartE2EDuration="2.262805423s" podCreationTimestamp="2026-03-20 13:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:18.258748828 +0000 UTC m=+1317.856681417" watchObservedRunningTime="2026-03-20 13:52:18.262805423 +0000 UTC m=+1317.860737962"
Mar 20 13:52:18 crc kubenswrapper[4755]: I0320 13:52:18.290888 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.29086228 podStartE2EDuration="2.29086228s" podCreationTimestamp="2026-03-20 13:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:18.288838207 +0000 UTC m=+1317.886770756" watchObservedRunningTime="2026-03-20 13:52:18.29086228 +0000 UTC m=+1317.888794849"
Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.859820 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.973890 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") "
Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.973962 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") "
Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.974158 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") pod \"5c488527-be33-4a36-a073-1a49802e28dd\" (UID: \"5c488527-be33-4a36-a073-1a49802e28dd\") "
Mar 20 13:52:19 crc kubenswrapper[4755]: I0320 13:52:19.980257 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n" (OuterVolumeSpecName: "kube-api-access-lpf6n") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "kube-api-access-lpf6n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.001882 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data" (OuterVolumeSpecName: "config-data") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.038509 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c488527-be33-4a36-a073-1a49802e28dd" (UID: "5c488527-be33-4a36-a073-1a49802e28dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075911 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpf6n\" (UniqueName: \"kubernetes.io/projected/5c488527-be33-4a36-a073-1a49802e28dd-kube-api-access-lpf6n\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075950 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.075960 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c488527-be33-4a36-a073-1a49802e28dd-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.252078 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.260907 4755 generic.go:334] "Generic (PLEG): container finished" podID="9039b999-a68c-4920-af85-ac61d8509b06" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5" exitCode=0
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261059 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261341 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"}
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261375 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9039b999-a68c-4920-af85-ac61d8509b06","Type":"ContainerDied","Data":"d305ab31e9622ed372defac60b08de6826298aff9ba6d4afc85a9d25d074e86a"}
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.261398 4755 scope.go:117] "RemoveContainer" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.264372 4755 generic.go:334] "Generic (PLEG): container finished" podID="5c488527-be33-4a36-a073-1a49802e28dd" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" exitCode=0
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.265104 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.269584 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerDied","Data":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"}
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.269725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5c488527-be33-4a36-a073-1a49802e28dd","Type":"ContainerDied","Data":"07fce25f45c6fe707885052798e47cbce52b19aa7717044f52d0bb81703a15ba"}
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279507 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") "
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279614 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") "
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279798 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") "
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.279853 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") pod \"9039b999-a68c-4920-af85-ac61d8509b06\" (UID: \"9039b999-a68c-4920-af85-ac61d8509b06\") "
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.280750 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs" (OuterVolumeSpecName: "logs") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.287057 4755 scope.go:117] "RemoveContainer" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.288151 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85" (OuterVolumeSpecName: "kube-api-access-jwb85") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "kube-api-access-jwb85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.319781 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.322359 4755 scope.go:117] "RemoveContainer" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.322365 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.323525 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": container with ID starting with a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5 not found: ID does not exist" containerID="a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.323587 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5"} err="failed to get container status \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": rpc error: code = NotFound desc = could not find container \"a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5\": container with ID starting with a1363307707c300f0c8d00f5fbbe6f376507ce1cacf940f8e6b890d6df5afde5 not found: ID does not exist"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.323624 4755 scope.go:117] "RemoveContainer" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"
Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.324669 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": container with ID starting with 51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831 not found: ID does not exist" containerID="51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.324736 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831"} err="failed to get container status \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": rpc error: code = NotFound desc = could not find container \"51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831\": container with ID starting with 51f51cd19ce802704345e2367428ecfbdd3ce717907744fbf2492b4930c09831 not found: ID does not exist"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.324758 4755 scope.go:117] "RemoveContainer" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.331680 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.339800 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data" (OuterVolumeSpecName: "config-data") pod "9039b999-a68c-4920-af85-ac61d8509b06" (UID: "9039b999-a68c-4920-af85-ac61d8509b06"). InnerVolumeSpecName "config-data".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.356871 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357416 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357433 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357455 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357464 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.357487 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357495 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357716 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c488527-be33-4a36-a073-1a49802e28dd" containerName="nova-scheduler-scheduler" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357739 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b999-a68c-4920-af85-ac61d8509b06" containerName="nova-api-log" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.357757 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9039b999-a68c-4920-af85-ac61d8509b06" 
containerName="nova-api-api" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.358528 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.360419 4755 scope.go:117] "RemoveContainer" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.361193 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:52:20 crc kubenswrapper[4755]: E0320 13:52:20.362122 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": container with ID starting with e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57 not found: ID does not exist" containerID="e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.362230 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57"} err="failed to get container status \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": rpc error: code = NotFound desc = could not find container \"e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57\": container with ID starting with e1339782c9ef52a4fb795c009566b2def88ea1e5c5f6a19a74bbcac41a602c57 not found: ID does not exist" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383170 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " 
pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383322 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383410 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383491 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383505 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwb85\" (UniqueName: \"kubernetes.io/projected/9039b999-a68c-4920-af85-ac61d8509b06-kube-api-access-jwb85\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383521 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9039b999-a68c-4920-af85-ac61d8509b06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.383532 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9039b999-a68c-4920-af85-ac61d8509b06-logs\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.486613 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.487282 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.487409 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.489849 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.491021 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.503695 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5r92\" 
(UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"nova-scheduler-0\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.640839 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.658982 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.676547 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.678264 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.678373 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.681013 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.685037 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.794810 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.794884 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.795007 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.795080 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896580 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896742 4755 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896786 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.896941 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.898133 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.904197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:20 crc kubenswrapper[4755]: I0320 13:52:20.911180 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: 
I0320 13:52:20.921943 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"nova-api-0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.004036 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.182553 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:52:21 crc kubenswrapper[4755]: W0320 13:52:21.184837 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b22e0c3_341e_444d_a615_50d5ccdc9f12.slice/crio-0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82 WatchSource:0}: Error finding container 0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82: Status 404 returned error can't find the container with id 0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82 Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.240319 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c488527-be33-4a36-a073-1a49802e28dd" path="/var/lib/kubelet/pods/5c488527-be33-4a36-a073-1a49802e28dd/volumes" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.242019 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9039b999-a68c-4920-af85-ac61d8509b06" path="/var/lib/kubelet/pods/9039b999-a68c-4920-af85-ac61d8509b06/volumes" Mar 20 13:52:21 crc kubenswrapper[4755]: I0320 13:52:21.291029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerStarted","Data":"0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82"} Mar 20 13:52:22 crc 
kubenswrapper[4755]: I0320 13:52:22.088546 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.308586 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"} Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.308637 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"69613a5a056ceb190560dd090764a72a502dd1b9d118cfd002538e2432b55f6e"} Mar 20 13:52:22 crc kubenswrapper[4755]: I0320 13:52:22.310147 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerStarted","Data":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.325181 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerStarted","Data":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.357827 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.357795775 podStartE2EDuration="3.357795775s" podCreationTimestamp="2026-03-20 13:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:22.338126167 +0000 UTC m=+1321.936058696" watchObservedRunningTime="2026-03-20 13:52:23.357795775 +0000 UTC m=+1322.955728344" Mar 20 13:52:23 crc kubenswrapper[4755]: I0320 13:52:23.360910 4755 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.360896155 podStartE2EDuration="3.360896155s" podCreationTimestamp="2026-03-20 13:52:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:23.346103862 +0000 UTC m=+1322.944036411" watchObservedRunningTime="2026-03-20 13:52:23.360896155 +0000 UTC m=+1322.958828724" Mar 20 13:52:24 crc kubenswrapper[4755]: I0320 13:52:24.125849 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:52:25 crc kubenswrapper[4755]: I0320 13:52:25.686352 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.610153 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.641680 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:52:26 crc kubenswrapper[4755]: I0320 13:52:26.641733 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.162188 4755 scope.go:117] "RemoveContainer" containerID="cdf7ecd10574feecd4a5c8a0aa5c3c10ad9149f00e749b3e48d52a3b5587a97f" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.657820 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:27 crc kubenswrapper[4755]: I0320 13:52:27.657835 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" 
podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.200:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.233499 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.234241 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" containerID="cri-o://8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" gracePeriod=30 Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.381478 4755 generic.go:334] "Generic (PLEG): container finished" podID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerID="8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" exitCode=2 Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.381519 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerDied","Data":"8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28"} Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.782358 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.888752 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") pod \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\" (UID: \"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60\") " Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.897213 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw" (OuterVolumeSpecName: "kube-api-access-4tpmw") pod "5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" (UID: "5ac4bdab-eaee-4ee6-a3e1-2f754c179d60"). InnerVolumeSpecName "kube-api-access-4tpmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4755]: I0320 13:52:28.990762 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tpmw\" (UniqueName: \"kubernetes.io/projected/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60-kube-api-access-4tpmw\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392113 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5ac4bdab-eaee-4ee6-a3e1-2f754c179d60","Type":"ContainerDied","Data":"1f936cfbd135019d1572ee465a4fb61fade57721a1a7701a47ec15a9bf86c1cd"} Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392191 4755 scope.go:117] "RemoveContainer" containerID="8ce464205429ca09c78b8e5b5322b476894ef184cfd7f3e7208680387bf26d28" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.392199 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.422050 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.434278 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.438751 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: E0320 13:52:29.439228 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.439252 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.439480 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" containerName="kube-state-metrics" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.440140 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.449534 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.457395 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.461400 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499300 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499453 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499484 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.499525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfvrd\" (UniqueName: 
\"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601805 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601859 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601894 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfvrd\" (UniqueName: \"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.601954 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.615978 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.616013 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.616526 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f27a8a2-0755-47ae-a7b4-63787c8c9393-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.619089 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfvrd\" (UniqueName: \"kubernetes.io/projected/4f27a8a2-0755-47ae-a7b4-63787c8c9393-kube-api-access-cfvrd\") pod \"kube-state-metrics-0\" (UID: \"4f27a8a2-0755-47ae-a7b4-63787c8c9393\") " pod="openstack/kube-state-metrics-0" Mar 20 13:52:29 crc kubenswrapper[4755]: I0320 13:52:29.757922 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.209359 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210312 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" containerID="cri-o://d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210396 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" containerID="cri-o://d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210551 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" containerID="cri-o://a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.210410 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" containerID="cri-o://eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" gracePeriod=30 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.264287 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.401543 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"4f27a8a2-0755-47ae-a7b4-63787c8c9393","Type":"ContainerStarted","Data":"b92695a785998a573c5e5b231bdac4b7f5867d17cc52d242074f5f1bbe5bbca6"} Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.409606 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" exitCode=2 Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.409686 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"} Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.685938 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:52:30 crc kubenswrapper[4755]: I0320 13:52:30.718295 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.005550 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.005599 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.237904 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ac4bdab-eaee-4ee6-a3e1-2f754c179d60" path="/var/lib/kubelet/pods/5ac4bdab-eaee-4ee6-a3e1-2f754c179d60/volumes" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.421580 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" exitCode=0 Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.423089 4755 generic.go:334] "Generic (PLEG): container 
finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" exitCode=0 Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.421624 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.423309 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.425276 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4f27a8a2-0755-47ae-a7b4-63787c8c9393","Type":"ContainerStarted","Data":"37e1a33a39052d95fffee5fa310d247339624777ba841f4ced83f91cbf750277"} Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.425611 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.454570 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.073128355 podStartE2EDuration="2.454548516s" podCreationTimestamp="2026-03-20 13:52:29 +0000 UTC" firstStartedPulling="2026-03-20 13:52:30.278029115 +0000 UTC m=+1329.875961644" lastFinishedPulling="2026-03-20 13:52:30.659449286 +0000 UTC m=+1330.257381805" observedRunningTime="2026-03-20 13:52:31.442449413 +0000 UTC m=+1331.040381942" watchObservedRunningTime="2026-03-20 13:52:31.454548516 +0000 UTC m=+1331.052481055" Mar 20 13:52:31 crc kubenswrapper[4755]: I0320 13:52:31.491568 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-scheduler-0" Mar 20 13:52:32 crc kubenswrapper[4755]: I0320 13:52:32.087879 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:32 crc kubenswrapper[4755]: I0320 13:52:32.087919 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:52:34 crc kubenswrapper[4755]: I0320 13:52:34.642412 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:52:34 crc kubenswrapper[4755]: I0320 13:52:34.642941 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.435952 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492630 4755 generic.go:334] "Generic (PLEG): container finished" podID="886eb096-8aa3-423b-b611-03cc592de1d0" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" exitCode=0 Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492708 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"} Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492743 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"886eb096-8aa3-423b-b611-03cc592de1d0","Type":"ContainerDied","Data":"05380ce5bc1f7a737645adb8b43e7705ec87079192f12498b408a2f6bb56aa29"} Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492763 4755 scope.go:117] "RemoveContainer" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.492963 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.519941 4755 scope.go:117] "RemoveContainer" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525064 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525157 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525192 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525215 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525247 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: 
\"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525340 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525456 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") pod \"886eb096-8aa3-423b-b611-03cc592de1d0\" (UID: \"886eb096-8aa3-423b-b611-03cc592de1d0\") " Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525495 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.525878 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.526854 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.540497 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs" (OuterVolumeSpecName: "kube-api-access-5n5fs") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "kube-api-access-5n5fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.542267 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts" (OuterVolumeSpecName: "scripts") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.557258 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.560018 4755 scope.go:117] "RemoveContainer" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.607640 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630827 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/886eb096-8aa3-423b-b611-03cc592de1d0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630856 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630866 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630875 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.630884 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n5fs\" (UniqueName: \"kubernetes.io/projected/886eb096-8aa3-423b-b611-03cc592de1d0-kube-api-access-5n5fs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.635403 4755 scope.go:117] "RemoveContainer" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.641348 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data" (OuterVolumeSpecName: "config-data") pod "886eb096-8aa3-423b-b611-03cc592de1d0" (UID: "886eb096-8aa3-423b-b611-03cc592de1d0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662212 4755 scope.go:117] "RemoveContainer" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.662508 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": container with ID starting with eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398 not found: ID does not exist" containerID="eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662533 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398"} err="failed to get container status \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": rpc error: code = NotFound desc = could not find container \"eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398\": container with ID starting with eedb5c160bd74c8b5c0ab3836fd8cdfbef6eecc6f8ae11ce73e8fc3a4495e398 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662556 4755 scope.go:117] "RemoveContainer" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.662919 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": container with ID starting with d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44 not found: ID does not exist" containerID="d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.662978 
4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44"} err="failed to get container status \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": rpc error: code = NotFound desc = could not find container \"d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44\": container with ID starting with d1e5e3991442d8ddb69c3c63096442ecdfdde3ea3d74740a1ba3a44648bb4f44 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663034 4755 scope.go:117] "RemoveContainer" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.663549 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": container with ID starting with a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0 not found: ID does not exist" containerID="a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663572 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0"} err="failed to get container status \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": rpc error: code = NotFound desc = could not find container \"a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0\": container with ID starting with a718c7c7b04175b4a7b635b2749ffadeec561d96d2f0bdc30593280a75b47dc0 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663587 4755 scope.go:117] "RemoveContainer" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 
13:52:35.663824 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": container with ID starting with d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5 not found: ID does not exist" containerID="d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.663859 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5"} err="failed to get container status \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": rpc error: code = NotFound desc = could not find container \"d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5\": container with ID starting with d91fe7e194f4539db2b87ae3cb16bccc88564a1be2bb77dd9272bf75c96419a5 not found: ID does not exist" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.732305 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/886eb096-8aa3-423b-b611-03cc592de1d0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.830520 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.838269 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.866617 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867069 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867093 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867109 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867115 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867127 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867134 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: E0320 13:52:35.867163 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867169 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867487 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="sg-core" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867514 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-central-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.867526 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="ceilometer-notification-agent" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 
13:52:35.867537 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" containerName="proxy-httpd" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.871231 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.873295 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.873510 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.875914 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.910539 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935505 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935588 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935620 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.935642 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936017 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936163 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936207 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:35 crc kubenswrapper[4755]: I0320 13:52:35.936377 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " 
pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037821 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037907 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037930 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.037976 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038020 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvdc2\" (UniqueName: 
\"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038099 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038124 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038525 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.038609 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.041984 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.042163 4755 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.043438 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.044197 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.046546 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.057206 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ceilometer-0\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") " pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.193866 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.647312 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.692368 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.693342 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.751498 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.751567 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:52:36 crc kubenswrapper[4755]: I0320 13:52:36.774541 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.236582 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="886eb096-8aa3-423b-b611-03cc592de1d0" path="/var/lib/kubelet/pods/886eb096-8aa3-423b-b611-03cc592de1d0/volumes" Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.517199 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"94bfcab311a4ee3e71d6c186d7867de80fdb4abfe47c1e419a433ca5d4a60238"} Mar 20 13:52:37 crc kubenswrapper[4755]: I0320 13:52:37.524622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.528250 4755 generic.go:334] "Generic (PLEG): container finished" podID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerID="56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" exitCode=137 Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.528316 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerDied","Data":"56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.529325 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b54ba84f-e5e3-48ba-b283-2c37348fef90","Type":"ContainerDied","Data":"ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.529342 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec205dafe86043e70d781baf18f8b1170a7b1ab37a63103271c795f73d1873fe" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532057 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532115 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.532139 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.593802 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.594041 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.594093 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") pod \"b54ba84f-e5e3-48ba-b283-2c37348fef90\" (UID: \"b54ba84f-e5e3-48ba-b283-2c37348fef90\") " Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.598145 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn" (OuterVolumeSpecName: 
"kube-api-access-5pgcn") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "kube-api-access-5pgcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.621890 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.639929 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data" (OuterVolumeSpecName: "config-data") pod "b54ba84f-e5e3-48ba-b283-2c37348fef90" (UID: "b54ba84f-e5e3-48ba-b283-2c37348fef90"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697515 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pgcn\" (UniqueName: \"kubernetes.io/projected/b54ba84f-e5e3-48ba-b283-2c37348fef90-kube-api-access-5pgcn\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697544 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:38 crc kubenswrapper[4755]: I0320 13:52:38.697555 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b54ba84f-e5e3-48ba-b283-2c37348fef90-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.004859 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.004915 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.547186 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.547194 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.573199 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.589272 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608009 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: E0320 13:52:39.608409 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608426 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.608674 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.609334 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.611446 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.611535 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.613771 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.632426 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717212 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717525 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.717674 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc 
kubenswrapper[4755]: I0320 13:52:39.717842 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.718122 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.768709 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820215 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820293 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820338 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") 
pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820358 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.820434 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.829799 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.841401 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.845328 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " 
pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.851175 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/8318edf5-5648-4c19-8853-3d555435ed6f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.863215 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7sp\" (UniqueName: \"kubernetes.io/projected/8318edf5-5648-4c19-8853-3d555435ed6f-kube-api-access-bk7sp\") pod \"nova-cell1-novncproxy-0\" (UID: \"8318edf5-5648-4c19-8853-3d555435ed6f\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:39 crc kubenswrapper[4755]: I0320 13:52:39.930192 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:40 crc kubenswrapper[4755]: I0320 13:52:40.382899 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:52:40 crc kubenswrapper[4755]: I0320 13:52:40.556781 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8318edf5-5648-4c19-8853-3d555435ed6f","Type":"ContainerStarted","Data":"7f512918bbc22e31ca0c18f92be8801e701687c7309119e0bebcd5b0ee178fe2"} Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.026810 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.031563 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.035206 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 
13:52:41.240121 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54ba84f-e5e3-48ba-b283-2c37348fef90" path="/var/lib/kubelet/pods/b54ba84f-e5e3-48ba-b283-2c37348fef90/volumes" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.570462 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"8318edf5-5648-4c19-8853-3d555435ed6f","Type":"ContainerStarted","Data":"8e5c202f3642f29bd6ba37127fc1deb176e98ed39a109baf7f1c56e5ccff9652"} Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.577216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.598371 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.598346093 podStartE2EDuration="2.598346093s" podCreationTimestamp="2026-03-20 13:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:41.587054561 +0000 UTC m=+1341.184987100" watchObservedRunningTime="2026-03-20 13:52:41.598346093 +0000 UTC m=+1341.196278632" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.905807 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.907306 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:41 crc kubenswrapper[4755]: I0320 13:52:41.916066 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053714 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053803 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.053948 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054225 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054322 4755 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.054390 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155811 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155870 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155964 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.155997 4755 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.156057 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.156160 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.158878 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.163742 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-config\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.163879 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.164229 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.169540 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/204ff403-3d73-430e-aa64-a41f033f641e-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.173014 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh629\" (UniqueName: \"kubernetes.io/projected/204ff403-3d73-430e-aa64-a41f033f641e-kube-api-access-xh629\") pod \"dnsmasq-dns-cd5cbd7b9-mkxft\" (UID: \"204ff403-3d73-430e-aa64-a41f033f641e\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.378512 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.597807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerStarted","Data":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.598071 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.625164 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.616080841 podStartE2EDuration="7.625146586s" podCreationTimestamp="2026-03-20 13:52:35 +0000 UTC" firstStartedPulling="2026-03-20 13:52:36.791337623 +0000 UTC m=+1336.389270152" lastFinishedPulling="2026-03-20 13:52:41.800403368 +0000 UTC m=+1341.398335897" observedRunningTime="2026-03-20 13:52:42.618304088 +0000 UTC m=+1342.216236637" watchObservedRunningTime="2026-03-20 13:52:42.625146586 +0000 UTC m=+1342.223079115" Mar 20 13:52:42 crc kubenswrapper[4755]: I0320 13:52:42.945886 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-mkxft"] Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613365 4755 generic.go:334] "Generic (PLEG): container finished" podID="204ff403-3d73-430e-aa64-a41f033f641e" containerID="12ccd499604429e6658a715e4e378949d8500574fbe5fddc12dbd0637665657f" exitCode=0 Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613454 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerDied","Data":"12ccd499604429e6658a715e4e378949d8500574fbe5fddc12dbd0637665657f"} Mar 20 13:52:43 crc kubenswrapper[4755]: I0320 13:52:43.613807 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerStarted","Data":"a95fceb205e141c263b018743de45ef8e3832fb1265ea1d6ba1bebf93404366a"}
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.287003 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.551839 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627314 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" event={"ID":"204ff403-3d73-430e-aa64-a41f033f641e","Type":"ContainerStarted","Data":"bf24d2988a5cea00c8fe4990e7a58673b96811d94ae34d804e06829ff9fc740c"}
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627450 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" containerID="cri-o://d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" gracePeriod=30
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627530 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" containerID="cri-o://8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" gracePeriod=30
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.627530 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft"
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.660013 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft" podStartSLOduration=3.659995265 podStartE2EDuration="3.659995265s" podCreationTimestamp="2026-03-20 13:52:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:44.654284167 +0000 UTC m=+1344.252216696" watchObservedRunningTime="2026-03-20 13:52:44.659995265 +0000 UTC m=+1344.257927784"
Mar 20 13:52:44 crc kubenswrapper[4755]: I0320 13:52:44.932047 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.636649 4755 generic.go:334] "Generic (PLEG): container finished" podID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" exitCode=143
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.636761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"}
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637351 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent" containerID="cri-o://af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" gracePeriod=30
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637385 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core" containerID="cri-o://3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" gracePeriod=30
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637436 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent" containerID="cri-o://d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" gracePeriod=30
Mar 20 13:52:45 crc kubenswrapper[4755]: I0320 13:52:45.637512 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd" containerID="cri-o://6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" gracePeriod=30
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.512331 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.648957 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a" exitCode=0
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.648992 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5" exitCode=2
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649003 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2" exitCode=0
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649011 4755 generic.go:334] "Generic (PLEG): container finished" podID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9" exitCode=0
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649033 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"}
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649033 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649062 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"}
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649077 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"}
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649088 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"}
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649099 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4","Type":"ContainerDied","Data":"94bfcab311a4ee3e71d6c186d7867de80fdb4abfe47c1e419a433ca5d4a60238"}
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.649117 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.651905 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.651939 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652156 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652299 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652373 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652467 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652498 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.652528 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") pod \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\" (UID: \"ba8f64e1-de6f-46c5-9857-7916aaa7c9d4\") "
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653127 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653161 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653893 4755 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.653913 4755 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.658028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2" (OuterVolumeSpecName: "kube-api-access-mvdc2") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "kube-api-access-mvdc2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.658623 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts" (OuterVolumeSpecName: "scripts") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.677991 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.693871 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.707000 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.723166 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.741792 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.754588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755546 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755581 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvdc2\" (UniqueName: \"kubernetes.io/projected/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-kube-api-access-mvdc2\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755594 4755 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755604 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.755614 4755 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773302 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.773794 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773835 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.773861 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774214 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774239 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774360 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774621 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774641 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774682 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: E0320 13:52:46.774890 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774909 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.774922 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775257 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775270 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775445 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775458 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775804 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.775822 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776010 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data" (OuterVolumeSpecName: "config-data") pod "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" (UID: "ba8f64e1-de6f-46c5-9857-7916aaa7c9d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776080 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776101 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776348 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776369 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776701 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776721 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776938 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.776958 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777374 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777429 4755 scope.go:117] "RemoveContainer" containerID="6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777733 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a"} err="failed to get container status \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": rpc error: code = NotFound desc = could not find container \"6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a\": container with ID starting with 6c2ef185e08512f079416873e901d19745412c9677e489b92dbe5f0560d7242a not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.777755 4755 scope.go:117] "RemoveContainer" containerID="3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.778007 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5"} err="failed to get container status \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": rpc error: code = NotFound desc = could not find container \"3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5\": container with ID starting with 3b5e37b0c6489986db0ca3715ccbeecf25ac76ff8e0aef68900319a62160bca5 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.778034 4755 scope.go:117] "RemoveContainer" containerID="d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.779850 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2"} err="failed to get container status \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": rpc error: code = NotFound desc = could not find container \"d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2\": container with ID starting with d7dbddda41d4c98eb5b449c914a7cec99458b7c68cf9b5bce02acce9fa8c0af2 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.779890 4755 scope.go:117] "RemoveContainer" containerID="af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.780183 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9"} err="failed to get container status \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": rpc error: code = NotFound desc = could not find container \"af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9\": container with ID starting with af6a1df59604010f20955d4dd322d672dd149a13a9eaeebd98f7e03a471711a9 not found: ID does not exist"
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.857543 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.979773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:52:46 crc kubenswrapper[4755]: I0320 13:52:46.986681 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.012762 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013246 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013267 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core"
Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013276 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013284 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd"
Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013300 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013308 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: E0320 13:52:47.013332 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013339 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013504 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-notification-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013515 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="ceilometer-central-agent"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013524 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="sg-core"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.013545 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" containerName="proxy-httpd"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.016001 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019119 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019305 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.019530 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.027328 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162308 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162838 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.162979 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163260 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163396 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163495 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163626 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.163690 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.238963 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8f64e1-de6f-46c5-9857-7916aaa7c9d4" path="/var/lib/kubelet/pods/ba8f64e1-de6f-46c5-9857-7916aaa7c9d4/volumes"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265826 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265920 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.265946 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266014 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266042 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266089 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266144 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.266162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.267823 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-log-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.268150 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-run-httpd\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0"
Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271562 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") "
pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271698 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-config-data\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.271851 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.272579 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-scripts\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.274341 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.284679 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdk2z\" (UniqueName: \"kubernetes.io/projected/0c583579-b927-4ef7-bfc9-0c54a2e77bcb-kube-api-access-rdk2z\") pod \"ceilometer-0\" (UID: \"0c583579-b927-4ef7-bfc9-0c54a2e77bcb\") " pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.332495 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:52:47 crc kubenswrapper[4755]: I0320 13:52:47.838797 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:52:47 crc kubenswrapper[4755]: W0320 13:52:47.883046 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c583579_b927_4ef7_bfc9_0c54a2e77bcb.slice/crio-72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d WatchSource:0}: Error finding container 72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d: Status 404 returned error can't find the container with id 72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.245087 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391382 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391523 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.391641 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc 
kubenswrapper[4755]: I0320 13:52:48.391726 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") pod \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\" (UID: \"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0\") " Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.392588 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs" (OuterVolumeSpecName: "logs") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.395708 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7" (OuterVolumeSpecName: "kube-api-access-fl9w7") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "kube-api-access-fl9w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.420684 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.433626 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data" (OuterVolumeSpecName: "config-data") pod "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" (UID: "8245d4eb-c2ea-4a18-ad12-3622eeaa48c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493448 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493484 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl9w7\" (UniqueName: \"kubernetes.io/projected/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-kube-api-access-fl9w7\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493494 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.493505 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670240 4755 generic.go:334] "Generic (PLEG): container finished" podID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" exitCode=0 Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670351 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"8245d4eb-c2ea-4a18-ad12-3622eeaa48c0","Type":"ContainerDied","Data":"69613a5a056ceb190560dd090764a72a502dd1b9d118cfd002538e2432b55f6e"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670410 4755 scope.go:117] "RemoveContainer" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.670415 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.673521 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"db7ea86e072e50ae870fae22c864dcd5d14312514d13b8d5b9f71b6ee5eb37c1"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.673566 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"72ffae20f04b34f6b2d50e4b941a8798db5c8a4d556638606ca11fbc1d20022d"} Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.721142 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.724547 4755 scope.go:117] "RemoveContainer" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.730529 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755215 4755 scope.go:117] "RemoveContainer" 
containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.755599 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": container with ID starting with 8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46 not found: ID does not exist" containerID="8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755632 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46"} err="failed to get container status \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": rpc error: code = NotFound desc = could not find container \"8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46\": container with ID starting with 8221e6432b6ca1649fec7a4a7dc97886c8e5b998a6bfa781e482d6aa8bee9d46 not found: ID does not exist" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755668 4755 scope.go:117] "RemoveContainer" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.755930 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": container with ID starting with d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8 not found: ID does not exist" containerID="d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.755980 4755 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8"} err="failed to get container status \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": rpc error: code = NotFound desc = could not find container \"d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8\": container with ID starting with d12512de7ee1bb85988241c9317fa75e93c645834d990a258259e50c5dcc5fc8 not found: ID does not exist" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757298 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.757772 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757793 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: E0320 13:52:48.757828 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.757836 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.758035 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-log" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.758056 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" containerName="nova-api-api" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.759272 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.762453 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.762676 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.764479 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.768411 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921184 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921334 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921366 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921438 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z59v\" 
(UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921489 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:48 crc kubenswrapper[4755]: I0320 13:52:48.921535 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023432 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023477 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023524 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 
20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023563 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023598 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.023687 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.027477 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.033891 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.034106 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.034224 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.035625 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.051298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"nova-api-0\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") " pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.128365 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.245498 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8245d4eb-c2ea-4a18-ad12-3622eeaa48c0" path="/var/lib/kubelet/pods/8245d4eb-c2ea-4a18-ad12-3622eeaa48c0/volumes" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.599120 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:52:49 crc kubenswrapper[4755]: W0320 13:52:49.611339 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb3effa2_e877_484a_8003_06a326a0b48b.slice/crio-8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f WatchSource:0}: Error finding container 8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f: Status 404 returned error can't find the container with id 8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.689773 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"2f273839bec72005f082a6cf6998a1a83b499301b33c48c31fd5b1ece7372f03"} Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.693670 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f"} Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.939216 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:49 crc kubenswrapper[4755]: I0320 13:52:49.983677 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 
13:52:50.706725 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"92d5de14150800072f17f3008f4d24092d55c9c324814e9b3e5ef7f104145444"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.721280 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.721508 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerStarted","Data":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.743758 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.744448 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.744434192 podStartE2EDuration="2.744434192s" podCreationTimestamp="2026-03-20 13:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:50.740762517 +0000 UTC m=+1350.338695046" watchObservedRunningTime="2026-03-20 13:52:50.744434192 +0000 UTC m=+1350.342366741" Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.951209 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.952739 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.955358 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.955550 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Mar 20 13:52:50 crc kubenswrapper[4755]: I0320 13:52:50.970128 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"]
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083684 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083743 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083834 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.083861 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185024 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185162 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185190 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.185260 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195504 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.195597 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.205986 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"nova-cell1-cell-mapping-7fs4m\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") " pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.278370 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:51 crc kubenswrapper[4755]: I0320 13:52:51.734865 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"]
Mar 20 13:52:51 crc kubenswrapper[4755]: W0320 13:52:51.739884 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod557c5385_782c_410a_a371_b27f41d88a47.slice/crio-4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd WatchSource:0}: Error finding container 4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd: Status 404 returned error can't find the container with id 4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.381850 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-mkxft"
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.465204 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"]
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.465435 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-726q6" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" containerID="cri-o://62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d" gracePeriod=10
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.742998 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0c583579-b927-4ef7-bfc9-0c54a2e77bcb","Type":"ContainerStarted","Data":"2350a80dfcdc41ddafcce0834f07da754b792f1d0e614e76d528ca12b6dc8def"}
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.744768 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.779188 4755 generic.go:334] "Generic (PLEG): container finished" podID="24dab838-4670-45f3-8276-240f4266194d" containerID="62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d" exitCode=0
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.779264 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d"}
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.789695 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerStarted","Data":"93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa"}
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.789729 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerStarted","Data":"4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd"}
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.792237 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.579152904 podStartE2EDuration="6.792208396s" podCreationTimestamp="2026-03-20 13:52:46 +0000 UTC" firstStartedPulling="2026-03-20 13:52:47.888288864 +0000 UTC m=+1347.486221393" lastFinishedPulling="2026-03-20 13:52:52.101344356 +0000 UTC m=+1351.699276885" observedRunningTime="2026-03-20 13:52:52.769050036 +0000 UTC m=+1352.366982565" watchObservedRunningTime="2026-03-20 13:52:52.792208396 +0000 UTC m=+1352.390140925"
Mar 20 13:52:52 crc kubenswrapper[4755]: I0320 13:52:52.813422 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7fs4m" podStartSLOduration=2.813407316 podStartE2EDuration="2.813407316s" podCreationTimestamp="2026-03-20 13:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:52:52.807936094 +0000 UTC m=+1352.405868623" watchObservedRunningTime="2026-03-20 13:52:52.813407316 +0000 UTC m=+1352.411339845"
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.024233 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6"
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.130348 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131375 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131569 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131677 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131825 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.131937 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") pod \"24dab838-4670-45f3-8276-240f4266194d\" (UID: \"24dab838-4670-45f3-8276-240f4266194d\") "
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.179032 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc" (OuterVolumeSpecName: "kube-api-access-75cnc") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "kube-api-access-75cnc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.187552 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config" (OuterVolumeSpecName: "config") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.204247 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.206940 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.208098 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.214208 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "24dab838-4670-45f3-8276-240f4266194d" (UID: "24dab838-4670-45f3-8276-240f4266194d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235519 4755 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235549 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235558 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75cnc\" (UniqueName: \"kubernetes.io/projected/24dab838-4670-45f3-8276-240f4266194d-kube-api-access-75cnc\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235570 4755 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235579 4755 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.235588 4755 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/24dab838-4670-45f3-8276-240f4266194d-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.797372 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-726q6" event={"ID":"24dab838-4670-45f3-8276-240f4266194d","Type":"ContainerDied","Data":"a715012945c056578e5ea9afdce0e862b1b605e848af6cc18ee8c8ac02d5a1c9"}
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.797438 4755 scope.go:117] "RemoveContainer" containerID="62f42b6a85158338dc75505a16a105269d7004c1c9843ad73ccf52fb4264882d"
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.797728 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-726q6"
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.857294 4755 scope.go:117] "RemoveContainer" containerID="1acd81a61329d7b9a26f38cb792eb39488a3ccf7a0bcc5d4334568c772df3f16"
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.877545 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"]
Mar 20 13:52:53 crc kubenswrapper[4755]: I0320 13:52:53.889637 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-726q6"]
Mar 20 13:52:55 crc kubenswrapper[4755]: I0320 13:52:55.251452 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24dab838-4670-45f3-8276-240f4266194d" path="/var/lib/kubelet/pods/24dab838-4670-45f3-8276-240f4266194d/volumes"
Mar 20 13:52:56 crc kubenswrapper[4755]: I0320 13:52:56.849848 4755 generic.go:334] "Generic (PLEG): container finished" podID="557c5385-782c-410a-a371-b27f41d88a47" containerID="93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa" exitCode=0
Mar 20 13:52:56 crc kubenswrapper[4755]: I0320 13:52:56.849887 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerDied","Data":"93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa"}
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.256583 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389102 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") "
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389363 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") "
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389921 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") "
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.389973 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") pod \"557c5385-782c-410a-a371-b27f41d88a47\" (UID: \"557c5385-782c-410a-a371-b27f41d88a47\") "
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.397106 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4" (OuterVolumeSpecName: "kube-api-access-h27h4") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "kube-api-access-h27h4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.397152 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts" (OuterVolumeSpecName: "scripts") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.427028 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.435215 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data" (OuterVolumeSpecName: "config-data") pod "557c5385-782c-410a-a371-b27f41d88a47" (UID: "557c5385-782c-410a-a371-b27f41d88a47"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493220 4755 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493260 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493273 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/557c5385-782c-410a-a371-b27f41d88a47-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.493288 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h27h4\" (UniqueName: \"kubernetes.io/projected/557c5385-782c-410a-a371-b27f41d88a47-kube-api-access-h27h4\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.878870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7fs4m" event={"ID":"557c5385-782c-410a-a371-b27f41d88a47","Type":"ContainerDied","Data":"4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd"}
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.879251 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4812a75d2c6eb61244177e0d0b8c5430015f2de96e08b7364143ce28989e3cbd"
Mar 20 13:52:58 crc kubenswrapper[4755]: I0320 13:52:58.878970 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7fs4m"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.112396 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.112792 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" containerID="cri-o://e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" gracePeriod=30
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.113489 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" containerID="cri-o://95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" gracePeriod=30
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.119832 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.120091 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" containerID="cri-o://1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" gracePeriod=30
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.293463 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.293897 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" containerID="cri-o://b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" gracePeriod=30
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.294028 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" containerID="cri-o://e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" gracePeriod=30
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.747565 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832142 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832212 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832293 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832345 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832400 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.832432 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") pod \"fb3effa2-e877-484a-8003-06a326a0b48b\" (UID: \"fb3effa2-e877-484a-8003-06a326a0b48b\") "
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.834186 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs" (OuterVolumeSpecName: "logs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.839081 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v" (OuterVolumeSpecName: "kube-api-access-8z59v") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "kube-api-access-8z59v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.868547 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.885313 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data" (OuterVolumeSpecName: "config-data") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.900950 4755 generic.go:334] "Generic (PLEG): container finished" podID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerID="b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" exitCode=143
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.901040 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7"}
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904023 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb3effa2-e877-484a-8003-06a326a0b48b" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e" exitCode=0
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904077 4755 generic.go:334] "Generic (PLEG): container finished" podID="fb3effa2-e877-484a-8003-06a326a0b48b" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd" exitCode=143
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904112 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"}
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904165 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"}
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904163 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904200 4755 scope.go:117] "RemoveContainer" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.904183 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fb3effa2-e877-484a-8003-06a326a0b48b","Type":"ContainerDied","Data":"8e9494add45ad54f7f290a586572e26e32ab31039ce6e01e23c4e147c0955e3f"}
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.921332 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.923350 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fb3effa2-e877-484a-8003-06a326a0b48b" (UID: "fb3effa2-e877-484a-8003-06a326a0b48b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.936565 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939631 4755 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939703 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z59v\" (UniqueName: \"kubernetes.io/projected/fb3effa2-e877-484a-8003-06a326a0b48b-kube-api-access-8z59v\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939714 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb3effa2-e877-484a-8003-06a326a0b48b-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939725 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939733 4755 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.939743 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb3effa2-e877-484a-8003-06a326a0b48b-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.967508 4755 scope.go:117] "RemoveContainer" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"
Mar 20 13:52:59 crc kubenswrapper[4755]: E0320 13:52:59.968787 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.968832 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} err="failed to get container status \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.968864 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"
Mar 20 13:52:59 crc kubenswrapper[4755]: E0320 13:52:59.969636 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not exist" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.969715 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} err="failed to get container status \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not exist"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970256 4755 scope.go:117] "RemoveContainer" containerID="95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970682 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e"} err="failed to get container status \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": rpc error: code = NotFound desc = could not find container \"95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e\": container with ID starting with 95cd9e75dad57e7616324f865e213c9cbc377b653d6777e9eabd03afcc65a88e not found: ID does not exist"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970704 4755 scope.go:117] "RemoveContainer" containerID="e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"
Mar 20 13:52:59 crc kubenswrapper[4755]: I0320 13:52:59.970934 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd"} err="failed to get container status \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": rpc error: code = NotFound desc = could not find container \"e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd\": container with ID starting with e6f1029a9620dcea874ca2949774c226d0089d8b66d63fd7eaba9d0d1808fcfd not found: ID does not
exist" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.239734 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.247417 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.267273 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268013 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268118 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268215 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="init" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268291 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="init" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268362 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268429 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268506 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268567 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 
13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.268676 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.268743 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269021 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="24dab838-4670-45f3-8276-240f4266194d" containerName="dnsmasq-dns" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269131 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="557c5385-782c-410a-a371-b27f41d88a47" containerName="nova-manage" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269213 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-api" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.269286 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" containerName="nova-api-log" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.270578 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275550 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275649 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.275866 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.291053 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347158 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347218 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347237 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347292 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347348 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.347383 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449520 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449591 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449651 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc 
kubenswrapper[4755]: I0320 13:53:00.449704 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449723 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.449759 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.450844 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-logs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.456056 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-internal-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.457968 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-config-data\") pod \"nova-api-0\" (UID: 
\"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.459030 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-public-tls-certs\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.459866 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.467527 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqt5x\" (UniqueName: \"kubernetes.io/projected/b975ad31-5e47-43b2-a0c6-4d1ee9e50006-kube-api-access-bqt5x\") pod \"nova-api-0\" (UID: \"b975ad31-5e47-43b2-a0c6-4d1ee9e50006\") " pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: I0320 13:53:00.647667 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.687179 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.688695 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.690390 4755 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:53:00 crc kubenswrapper[4755]: E0320 13:53:00.690449 4755 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.120351 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.234798 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb3effa2-e877-484a-8003-06a326a0b48b" 
path="/var/lib/kubelet/pods/fb3effa2-e877-484a-8003-06a326a0b48b/volumes" Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929001 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"0811e02ac39c52724b470230398e95a710c8071a3acffc5ce3ca53256fcea5c9"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929634 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"f9bd3e6c72d05a8c7eacb7ea7628f8d0cff791fc85e2282eb788e0d4df903a0e"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.929645 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b975ad31-5e47-43b2-a0c6-4d1ee9e50006","Type":"ContainerStarted","Data":"2a7e0970fbf965bb9ff6dc1fc817b8cc4ed04376556a205e8e23810c41f0146c"} Mar 20 13:53:01 crc kubenswrapper[4755]: I0320 13:53:01.950179 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.9501637139999999 podStartE2EDuration="1.950163714s" podCreationTimestamp="2026-03-20 13:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:01.948766928 +0000 UTC m=+1361.546699477" watchObservedRunningTime="2026-03-20 13:53:01.950163714 +0000 UTC m=+1361.548096243" Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.937925 4755 generic.go:334] "Generic (PLEG): container finished" podID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerID="e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" exitCode=0 Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938012 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b"} Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938376 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7","Type":"ContainerDied","Data":"b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab"} Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.938393 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2133d8e1db74457eb9788c3a82dfef46ce6f3e0827fdf1e17b8955f5ddc6aab" Mar 20 13:53:02 crc kubenswrapper[4755]: I0320 13:53:02.956213 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106418 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106571 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.106891 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107041 4755 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107632 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs" (OuterVolumeSpecName: "logs") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.107796 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") pod \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\" (UID: \"dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7\") " Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.108471 4755 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.112124 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc" (OuterVolumeSpecName: "kube-api-access-wnvlc") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "kube-api-access-wnvlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.134793 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data" (OuterVolumeSpecName: "config-data") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.142118 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.162083 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" (UID: "dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210299 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210341 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnvlc\" (UniqueName: \"kubernetes.io/projected/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-kube-api-access-wnvlc\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210361 4755 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.210372 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.947459 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.973143 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:03 crc kubenswrapper[4755]: I0320 13:53:03.982847 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.023819 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: E0320 13:53:04.024295 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024321 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: E0320 13:53:04.024350 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024358 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024596 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-log" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.024621 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" containerName="nova-metadata-metadata" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.025892 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.029281 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.030106 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.056209 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126041 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126121 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126330 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126562 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: 
\"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.126760 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228268 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228333 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228482 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228525 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.228563 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.229320 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80d8fd29-d89f-4955-86f3-a8137400c67b-logs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.233283 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.242350 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.242486 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80d8fd29-d89f-4955-86f3-a8137400c67b-config-data\") pod \"nova-metadata-0\" (UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.258010 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v599m\" (UniqueName: \"kubernetes.io/projected/80d8fd29-d89f-4955-86f3-a8137400c67b-kube-api-access-v599m\") pod \"nova-metadata-0\" 
(UID: \"80d8fd29-d89f-4955-86f3-a8137400c67b\") " pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.379003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.880345 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.886032 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:04 crc kubenswrapper[4755]: W0320 13:53:04.889706 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80d8fd29_d89f_4955_86f3_a8137400c67b.slice/crio-3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07 WatchSource:0}: Error finding container 3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07: Status 404 returned error can't find the container with id 3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07 Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987347 4755 generic.go:334] "Generic (PLEG): container finished" podID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" exitCode=0 Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987406 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerDied","Data":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987432 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"9b22e0c3-341e-444d-a615-50d5ccdc9f12","Type":"ContainerDied","Data":"0832f620042cff12a63e4f562749394b016950c8664c3e0fdafe5f529ec50e82"} Mar 20 13:53:04 crc 
kubenswrapper[4755]: I0320 13:53:04.987448 4755 scope.go:117] "RemoveContainer" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.987569 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:04 crc kubenswrapper[4755]: I0320 13:53:04.991816 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"3130101d0fc32fdbdf4bb9604d3d120db93ce57fafeb578ab304e783a2af3c07"} Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.016433 4755 scope.go:117] "RemoveContainer" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:05 crc kubenswrapper[4755]: E0320 13:53:05.020136 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": container with ID starting with 1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda not found: ID does not exist" containerID="1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.020195 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda"} err="failed to get container status \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": rpc error: code = NotFound desc = could not find container \"1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda\": container with ID starting with 1ba28ce4c3e25c259f12618a2dc0d278035a581fe24caeee1a759a253e90ceda not found: ID does not exist" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.050393 4755 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.051437 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.051643 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") pod \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\" (UID: \"9b22e0c3-341e-444d-a615-50d5ccdc9f12\") " Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.058368 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92" (OuterVolumeSpecName: "kube-api-access-f5r92") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "kube-api-access-f5r92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.124886 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.125039 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data" (OuterVolumeSpecName: "config-data") pod "9b22e0c3-341e-444d-a615-50d5ccdc9f12" (UID: "9b22e0c3-341e-444d-a615-50d5ccdc9f12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156495 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156524 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5r92\" (UniqueName: \"kubernetes.io/projected/9b22e0c3-341e-444d-a615-50d5ccdc9f12-kube-api-access-f5r92\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.156537 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b22e0c3-341e-444d-a615-50d5ccdc9f12-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.236348 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7" path="/var/lib/kubelet/pods/dc55c1b8-6ed7-41ba-b5a6-8fe3f03fe3c7/volumes" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.428773 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.454858 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.474203 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] 
Mar 20 13:53:05 crc kubenswrapper[4755]: E0320 13:53:05.474840 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.474864 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.475142 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" containerName="nova-scheduler-scheduler" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.475925 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.479713 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.487110 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.566569 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.567111 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.567151 4755 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668336 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.668526 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.680641 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.683940 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15175c96-bbe4-4a56-be68-a5db33909e54-config-data\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.686219 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b88bs\" (UniqueName: \"kubernetes.io/projected/15175c96-bbe4-4a56-be68-a5db33909e54-kube-api-access-b88bs\") pod \"nova-scheduler-0\" (UID: \"15175c96-bbe4-4a56-be68-a5db33909e54\") " pod="openstack/nova-scheduler-0" Mar 20 13:53:05 crc kubenswrapper[4755]: I0320 13:53:05.804003 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.008719 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"946bc1cbbf0bb5b4a2df740e13ae2bd9f5f09a66f23558abd1908cbc75084b24"} Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.008768 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"80d8fd29-d89f-4955-86f3-a8137400c67b","Type":"ContainerStarted","Data":"bc30af876b4ad2f08935a647d697807fdf037fa904c623faed14f2c09838428c"} Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.030067 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.030045725 podStartE2EDuration="3.030045725s" podCreationTimestamp="2026-03-20 13:53:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:06.028595498 +0000 UTC m=+1365.626528027" watchObservedRunningTime="2026-03-20 13:53:06.030045725 +0000 UTC m=+1365.627978254" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.242004 4755 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:53:06 crc kubenswrapper[4755]: W0320 13:53:06.250048 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15175c96_bbe4_4a56_be68_a5db33909e54.slice/crio-c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6 WatchSource:0}: Error finding container c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6: Status 404 returned error can't find the container with id c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6 Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.751737 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752108 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752150 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:53:06 crc kubenswrapper[4755]: I0320 13:53:06.752934 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:53:06 crc 
kubenswrapper[4755]: I0320 13:53:06.752992 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" gracePeriod=600 Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.026882 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15175c96-bbe4-4a56-be68-a5db33909e54","Type":"ContainerStarted","Data":"5a5a909caf71fd20e500db1f49d69a478f176d61164c67eb5c07dc619b5f6be3"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.026947 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"15175c96-bbe4-4a56-be68-a5db33909e54","Type":"ContainerStarted","Data":"c2bd40a41d21bd096a3bc83023cfd89e6ea411525e5915b437fbb6ccfdb3cde6"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030339 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" exitCode=0 Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030388 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74"} Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.030445 4755 scope.go:117] "RemoveContainer" containerID="4e38c59c77bbb81bbe9f02be9529cd72407390b3de58e15a37f2f1280b01b773" Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.056167 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.056150861 podStartE2EDuration="2.056150861s" 
podCreationTimestamp="2026-03-20 13:53:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:07.049097377 +0000 UTC m=+1366.647029906" watchObservedRunningTime="2026-03-20 13:53:07.056150861 +0000 UTC m=+1366.654083390" Mar 20 13:53:07 crc kubenswrapper[4755]: I0320 13:53:07.238703 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b22e0c3-341e-444d-a615-50d5ccdc9f12" path="/var/lib/kubelet/pods/9b22e0c3-341e-444d-a615-50d5ccdc9f12/volumes" Mar 20 13:53:08 crc kubenswrapper[4755]: I0320 13:53:08.044533 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.648323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.649136 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:53:10 crc kubenswrapper[4755]: I0320 13:53:10.804323 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:53:11 crc kubenswrapper[4755]: I0320 13:53:11.661891 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b975ad31-5e47-43b2-a0c6-4d1ee9e50006" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:11 crc kubenswrapper[4755]: I0320 13:53:11.662167 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b975ad31-5e47-43b2-a0c6-4d1ee9e50006" containerName="nova-api-api" 
probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:14 crc kubenswrapper[4755]: I0320 13:53:14.377639 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:53:14 crc kubenswrapper[4755]: I0320 13:53:14.379495 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.423820 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80d8fd29-d89f-4955-86f3-a8137400c67b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.423909 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="80d8fd29-d89f-4955-86f3-a8137400c67b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.804726 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:53:15 crc kubenswrapper[4755]: I0320 13:53:15.840222 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:53:16 crc kubenswrapper[4755]: I0320 13:53:16.180334 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:53:17 crc kubenswrapper[4755]: I0320 13:53:17.342020 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:53:18 crc kubenswrapper[4755]: I0320 13:53:18.647903 4755 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:53:18 crc kubenswrapper[4755]: I0320 13:53:18.648286 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.659015 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.661142 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:53:20 crc kubenswrapper[4755]: I0320 13:53:20.671269 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:53:21 crc kubenswrapper[4755]: I0320 13:53:21.208544 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:53:22 crc kubenswrapper[4755]: I0320 13:53:22.380173 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:53:22 crc kubenswrapper[4755]: I0320 13:53:22.381374 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.387793 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.389603 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:53:24 crc kubenswrapper[4755]: I0320 13:53:24.397066 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:53:25 crc kubenswrapper[4755]: I0320 13:53:25.269346 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:53:35 crc kubenswrapper[4755]: I0320 13:53:35.368453 4755 pod_container_manager_linux.go:210] 
"Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod9b22e0c3-341e-444d-a615-50d5ccdc9f12"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod9b22e0c3-341e-444d-a615-50d5ccdc9f12] : Timed out while waiting for systemd to remove kubepods-besteffort-pod9b22e0c3_341e_444d_a615_50d5ccdc9f12.slice" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.165698 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.168123 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.170398 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.171703 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.171927 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.198296 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.229996 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.333743 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.384399 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"auto-csr-approver-29566914-xfmxl\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:00 crc kubenswrapper[4755]: I0320 13:54:00.511610 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:01 crc kubenswrapper[4755]: I0320 13:54:01.025343 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 13:54:01 crc kubenswrapper[4755]: I0320 13:54:01.681307 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerStarted","Data":"3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c"} Mar 20 13:54:02 crc kubenswrapper[4755]: I0320 13:54:02.694778 4755 generic.go:334] "Generic (PLEG): container finished" podID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerID="e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c" exitCode=0 Mar 20 13:54:02 crc kubenswrapper[4755]: I0320 13:54:02.694861 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerDied","Data":"e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c"} Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 
13:54:04.113673 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.227763 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") pod \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\" (UID: \"ea7c11fe-b29d-4fa4-a46d-7079105e883e\") " Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.239811 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq" (OuterVolumeSpecName: "kube-api-access-8z9vq") pod "ea7c11fe-b29d-4fa4-a46d-7079105e883e" (UID: "ea7c11fe-b29d-4fa4-a46d-7079105e883e"). InnerVolumeSpecName "kube-api-access-8z9vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.330642 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z9vq\" (UniqueName: \"kubernetes.io/projected/ea7c11fe-b29d-4fa4-a46d-7079105e883e-kube-api-access-8z9vq\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717419 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" event={"ID":"ea7c11fe-b29d-4fa4-a46d-7079105e883e","Type":"ContainerDied","Data":"3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c"} Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717468 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3707cc3f15982118da5f921713ade44647be0ba8beaf31b7fcc5a1b76df5334c" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.717480 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-xfmxl" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.911070 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:04 crc kubenswrapper[4755]: E0320 13:54:04.911918 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.911942 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.912156 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" containerName="oc" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.913322 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.915621 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6z9hz"/"openshift-service-ca.crt" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.919645 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6z9hz"/"kube-root-ca.crt" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.919694 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6z9hz"/"default-dockercfg-4thtd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943217 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943568 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:04 crc kubenswrapper[4755]: I0320 13:54:04.943767 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049068 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049169 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.049740 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.103522 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wmwh\" (UniqueName: 
\"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"must-gather-2mgxd\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.218511 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.234921 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-dmw6j"] Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.249014 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:54:05 crc kubenswrapper[4755]: I0320 13:54:05.734752 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:54:05 crc kubenswrapper[4755]: W0320 13:54:05.737135 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e7e4d4d_749a_4ec8_89f4_1362f7787e43.slice/crio-e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2 WatchSource:0}: Error finding container e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2: Status 404 returned error can't find the container with id e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2 Mar 20 13:54:06 crc kubenswrapper[4755]: I0320 13:54:06.747531 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"e84d0ee1e43f55669d5fce50ac1081dc81d43aa6534a31f45c0fae04284aa6c2"} Mar 20 13:54:07 crc kubenswrapper[4755]: I0320 13:54:07.238794 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a434c164-9ea6-4062-b8f6-88bb58f41a64" 
path="/var/lib/kubelet/pods/a434c164-9ea6-4062-b8f6-88bb58f41a64/volumes" Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.788029 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4"} Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.788704 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerStarted","Data":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} Mar 20 13:54:10 crc kubenswrapper[4755]: I0320 13:54:10.806219 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" podStartSLOduration=2.625588198 podStartE2EDuration="6.806206114s" podCreationTimestamp="2026-03-20 13:54:04 +0000 UTC" firstStartedPulling="2026-03-20 13:54:05.738759487 +0000 UTC m=+1425.336692016" lastFinishedPulling="2026-03-20 13:54:09.919377393 +0000 UTC m=+1429.517309932" observedRunningTime="2026-03-20 13:54:10.801200172 +0000 UTC m=+1430.399132701" watchObservedRunningTime="2026-03-20 13:54:10.806206114 +0000 UTC m=+1430.404138643" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.236440 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.238152 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.365587 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.365703 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.466876 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.467243 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.467445 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc 
kubenswrapper[4755]: I0320 13:54:15.492262 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"crc-debug-kfkm2\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.560119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:15 crc kubenswrapper[4755]: I0320 13:54:15.855173 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerStarted","Data":"70470e286c3fd7782bb98a9a790ceaee1c9055dfab3df7a5943ee2120f5a6b70"} Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.639996 4755 scope.go:117] "RemoveContainer" containerID="52d80a295f203def80f45f1a56a14d0c5758de39ba1147d6937ffde8c9d85ad7" Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.967229 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerStarted","Data":"12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2"} Mar 20 13:54:27 crc kubenswrapper[4755]: I0320 13:54:27.984125 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" podStartSLOduration=1.628718084 podStartE2EDuration="12.98410567s" podCreationTimestamp="2026-03-20 13:54:15 +0000 UTC" firstStartedPulling="2026-03-20 13:54:15.607665691 +0000 UTC m=+1435.205598220" lastFinishedPulling="2026-03-20 13:54:26.963053277 +0000 UTC m=+1446.560985806" observedRunningTime="2026-03-20 13:54:27.981332137 +0000 UTC m=+1447.579264706" watchObservedRunningTime="2026-03-20 13:54:27.98410567 +0000 UTC 
m=+1447.582038209" Mar 20 13:54:46 crc kubenswrapper[4755]: I0320 13:54:46.140061 4755 generic.go:334] "Generic (PLEG): container finished" podID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerID="12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2" exitCode=0 Mar 20 13:54:46 crc kubenswrapper[4755]: I0320 13:54:46.140143 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" event={"ID":"1b539e13-5082-42e6-ac4d-a3f8fa788244","Type":"ContainerDied","Data":"12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2"} Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.257282 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.312585 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.330759 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-kfkm2"] Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379061 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") pod \"1b539e13-5082-42e6-ac4d-a3f8fa788244\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379120 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") pod \"1b539e13-5082-42e6-ac4d-a3f8fa788244\" (UID: \"1b539e13-5082-42e6-ac4d-a3f8fa788244\") " Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379204 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host" (OuterVolumeSpecName: "host") pod "1b539e13-5082-42e6-ac4d-a3f8fa788244" (UID: "1b539e13-5082-42e6-ac4d-a3f8fa788244"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.379703 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1b539e13-5082-42e6-ac4d-a3f8fa788244-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.385404 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn" (OuterVolumeSpecName: "kube-api-access-6ckgn") pod "1b539e13-5082-42e6-ac4d-a3f8fa788244" (UID: "1b539e13-5082-42e6-ac4d-a3f8fa788244"). InnerVolumeSpecName "kube-api-access-6ckgn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:47 crc kubenswrapper[4755]: I0320 13:54:47.480943 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ckgn\" (UniqueName: \"kubernetes.io/projected/1b539e13-5082-42e6-ac4d-a3f8fa788244-kube-api-access-6ckgn\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.160226 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70470e286c3fd7782bb98a9a790ceaee1c9055dfab3df7a5943ee2120f5a6b70" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.160344 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-kfkm2" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.494471 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:48 crc kubenswrapper[4755]: E0320 13:54:48.494838 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.494850 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.495045 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" containerName="container-00" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.495592 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.600336 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.600751 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703283 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703364 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.703466 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.733750 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"crc-debug-t5h2c\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: I0320 13:54:48.812416 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:48 crc kubenswrapper[4755]: W0320 13:54:48.851759 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5df38c8e_0ed7_4de9_a7ef_b95a385aee6e.slice/crio-5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325 WatchSource:0}: Error finding container 5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325: Status 404 returned error can't find the container with id 5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325 Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.169678 4755 generic.go:334] "Generic (PLEG): container finished" podID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerID="6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3" exitCode=1 Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.169772 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" event={"ID":"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e","Type":"ContainerDied","Data":"6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3"} Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.170050 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" event={"ID":"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e","Type":"ContainerStarted","Data":"5cd05ca572ed672f1c0c250ac6fdc953d7f9fcca167512de5493baa60ce3a325"} Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.214258 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.222306 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/crc-debug-t5h2c"] Mar 20 13:54:49 crc kubenswrapper[4755]: I0320 13:54:49.236587 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1b539e13-5082-42e6-ac4d-a3f8fa788244" path="/var/lib/kubelet/pods/1b539e13-5082-42e6-ac4d-a3f8fa788244/volumes" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.283404 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.331815 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") pod \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.331879 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host" (OuterVolumeSpecName: "host") pod "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" (UID: "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.332256 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") pod \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\" (UID: \"5df38c8e-0ed7-4de9-a7ef-b95a385aee6e\") " Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.332744 4755 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-host\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.347347 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw" (OuterVolumeSpecName: "kube-api-access-889lw") pod "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" (UID: "5df38c8e-0ed7-4de9-a7ef-b95a385aee6e"). InnerVolumeSpecName "kube-api-access-889lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:54:50 crc kubenswrapper[4755]: I0320 13:54:50.434056 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-889lw\" (UniqueName: \"kubernetes.io/projected/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e-kube-api-access-889lw\") on node \"crc\" DevicePath \"\"" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.190574 4755 scope.go:117] "RemoveContainer" containerID="6f88fa467b2aa84d0afc1eca44399e2bd3ea007654231eeee016bc077a3831b3" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.190708 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/crc-debug-t5h2c" Mar 20 13:54:51 crc kubenswrapper[4755]: I0320 13:54:51.241371 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" path="/var/lib/kubelet/pods/5df38c8e-0ed7-4de9-a7ef-b95a385aee6e/volumes" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.760573 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:08 crc kubenswrapper[4755]: E0320 13:55:08.761800 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.761817 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.762064 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5df38c8e-0ed7-4de9-a7ef-b95a385aee6e" containerName="container-00" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.763864 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.763969 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.808939 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.809016 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.809306 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910871 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: 
\"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.910967 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.911413 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.911483 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:08 crc kubenswrapper[4755]: I0320 13:55:08.942185 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"redhat-operators-k6d2r\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:09 crc kubenswrapper[4755]: I0320 13:55:09.091577 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:09 crc kubenswrapper[4755]: I0320 13:55:09.577910 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.389722 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" exitCode=0 Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.389870 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10"} Mar 20 13:55:10 crc kubenswrapper[4755]: I0320 13:55:10.390178 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"52b02238107f8ca4e75a6ad385761dee4f2cd32405f08f6c4064824ba74ac65f"} Mar 20 13:55:13 crc kubenswrapper[4755]: I0320 13:55:13.423509 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} Mar 20 13:55:14 crc kubenswrapper[4755]: I0320 13:55:14.441316 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" exitCode=0 Mar 20 13:55:14 crc kubenswrapper[4755]: I0320 13:55:14.441512 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" 
event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} Mar 20 13:55:15 crc kubenswrapper[4755]: I0320 13:55:15.466761 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerStarted","Data":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} Mar 20 13:55:15 crc kubenswrapper[4755]: I0320 13:55:15.493090 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k6d2r" podStartSLOduration=2.783330028 podStartE2EDuration="7.493052688s" podCreationTimestamp="2026-03-20 13:55:08 +0000 UTC" firstStartedPulling="2026-03-20 13:55:10.391854328 +0000 UTC m=+1489.989786857" lastFinishedPulling="2026-03-20 13:55:15.101576988 +0000 UTC m=+1494.699509517" observedRunningTime="2026-03-20 13:55:15.48632175 +0000 UTC m=+1495.084254279" watchObservedRunningTime="2026-03-20 13:55:15.493052688 +0000 UTC m=+1495.090985267" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.091972 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.093261 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.452885 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-1376-account-create-update-jhbhp_015c8ae7-1856-4b0c-b5ce-e2503a2080dc/mariadb-account-create-update/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.568070 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7769db74db-f4kfh_45fa2a85-b7d9-413c-827c-fdcbcec05faf/barbican-api/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 
13:55:19.653767 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7769db74db-f4kfh_45fa2a85-b7d9-413c-827c-fdcbcec05faf/barbican-api-log/0.log" Mar 20 13:55:19 crc kubenswrapper[4755]: I0320 13:55:19.859412 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-create-jm9nr_feb55e83-711d-4561-8b57-2a231944e1b1/mariadb-database-create/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.119439 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-db-sync-dtggj_95c76f8c-7b76-4714-adac-6297b84d6492/barbican-db-sync/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.151282 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:20 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:20 crc kubenswrapper[4755]: > Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.274109 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-cbc45f8f6-z2sx8_55a78d73-f853-49d7-99b2-81c25ea6bb20/barbican-keystone-listener/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.373507 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-cbc45f8f6-z2sx8_55a78d73-f853-49d7-99b2-81c25ea6bb20/barbican-keystone-listener-log/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.428449 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-56b9dc5449-j62ns_d2108220-35b4-45b7-a2bc-e93138394ff0/barbican-worker/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.528377 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-56b9dc5449-j62ns_d2108220-35b4-45b7-a2bc-e93138394ff0/barbican-worker-log/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.735893 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/ceilometer-central-agent/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.744527 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/ceilometer-notification-agent/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.755499 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/proxy-httpd/0.log" Mar 20 13:55:20 crc kubenswrapper[4755]: I0320 13:55:20.923095 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0c583579-b927-4ef7-bfc9-0c54a2e77bcb/sg-core/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.030955 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-9528-account-create-update-6xkmx_5dde547e-5fce-4868-ba0e-63650ea0c771/mariadb-account-create-update/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.103802 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9894c7cb-7899-4354-a6c2-e7339eb1f765/cinder-api/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.195672 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_9894c7cb-7899-4354-a6c2-e7339eb1f765/cinder-api-log/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.302361 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-2jwbt_8c5d05dc-a589-4d2e-9374-0d57202a3cfc/mariadb-database-create/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.384152 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-db-sync-jrf8c_25bd1da4-7fdb-4bd9-8405-a37fc6c18be0/cinder-db-sync/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.519860 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df39e954-98b1-4c7c-bc51-5c2ee4db8a6d/cinder-scheduler/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.549620 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_df39e954-98b1-4c7c-bc51-5c2ee4db8a6d/probe/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.694325 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/init/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.833379 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/dnsmasq-dns/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.869197 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cd5cbd7b9-mkxft_204ff403-3d73-430e-aa64-a41f033f641e/init/0.log" Mar 20 13:55:21 crc kubenswrapper[4755]: I0320 13:55:21.924176 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-35fe-account-create-update-h6fl8_46d041c2-e231-49fd-9d88-a991a1b9dd65/mariadb-account-create-update/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.074296 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-pg2bq_6fe77db3-29ef-42ae-840b-9736f07188ca/mariadb-database-create/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.144247 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-sync-w78rr_3047e6fe-5128-4361-bede-e9f0c4e9387c/glance-db-sync/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.327207 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_e65d1645-8a19-459e-ac89-b485f27e2841/glance-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.343908 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_e65d1645-8a19-459e-ac89-b485f27e2841/glance-httpd/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.533761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6b182ae3-20c9-48af-9313-d48a608924b1/glance-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.550639 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_6b182ae3-20c9-48af-9313-d48a608924b1/glance-httpd/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.673690 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9f7d4c74d-t7tpq_2af5836e-8c76-4432-95c0-ef34d6fc3528/horizon/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.796064 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-9f7d4c74d-t7tpq_2af5836e-8c76-4432-95c0-ef34d6fc3528/horizon-log/0.log" Mar 20 13:55:22 crc kubenswrapper[4755]: I0320 13:55:22.875992 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8e2f-account-create-update-cvvh2_79c00857-0d6a-4c12-8581-da16e2a24f04/mariadb-account-create-update/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.036104 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-8f554bbf4-zvxzv_ab9d92e7-deba-4bdd-a267-e35fd5ec2f23/keystone-api/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.123530 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-rwsvb_5dddb768-c318-44b8-bac9-ea26f29ca038/keystone-bootstrap/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.213496 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-g2hvs_0795b626-b382-4b9b-beb5-802cebc4f764/mariadb-database-create/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.287028 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-9xrbx_64ad8e64-0606-4171-bd2d-ae8212fdff8f/keystone-db-sync/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.353282 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_4f27a8a2-0755-47ae-a7b4-63787c8c9393/kube-state-metrics/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.567164 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754b98cbff-jgntp_0263cee7-e9d5-48ff-8326-7455a95311a6/neutron-httpd/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.630205 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-754b98cbff-jgntp_0263cee7-e9d5-48ff-8326-7455a95311a6/neutron-api/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.719722 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-hm9qz_e38d31ac-eae6-4cd1-be04-304215db852a/mariadb-database-create/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.874186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-52m67_69707be4-e338-4e13-8ecc-8cfd7cd416b2/neutron-db-sync/0.log" Mar 20 13:55:23 crc kubenswrapper[4755]: I0320 13:55:23.943533 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-fc04-account-create-update-x9t57_34c85756-25cf-4302-bd5d-72f2e459f562/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.173089 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b975ad31-5e47-43b2-a0c6-4d1ee9e50006/nova-api-log/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.226114 
4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_b975ad31-5e47-43b2-a0c6-4d1ee9e50006/nova-api-api/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.258965 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-c99c-account-create-update-5s889_03accbff-bdf2-4256-bdf2-1b39d5485673/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.372522 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-9jv87_0deb3f1a-0cad-4429-9e79-38e5a0b38896/mariadb-database-create/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.486213 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-4e76-account-create-update-vjcr6_523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86/mariadb-account-create-update/0.log" Mar 20 13:55:24 crc kubenswrapper[4755]: I0320 13:55:24.583276 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-vz8fw_2ff73477-b65b-4362-938c-94b1bb1f51b0/nova-manage/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.008892 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_676b01c6-a64d-4530-b157-10160afd719a/nova-cell0-conductor-conductor/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.016019 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-mbd9g_faef786e-b221-4fff-8d48-42b8163ed86a/nova-cell0-conductor-db-sync/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.131511 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-db-create-79jc8_32a5606c-c777-4c0b-951c-6ce2e03edd7e/mariadb-database-create/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.228422 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-cell-mapping-7fs4m_557c5385-782c-410a-a371-b27f41d88a47/nova-manage/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.464926 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_32aa4c4f-3c67-46f5-90ae-59d17077eb1d/nova-cell1-conductor-conductor/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.492820 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-qbtvj_cadbdc7c-ed66-43d7-82ee-d797beb959a8/nova-cell1-conductor-db-sync/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.673815 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-jqk4f_f395acec-f28b-4622-b349-127cf31ec92d/mariadb-database-create/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.679767 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-ee84-account-create-update-jpmvf_39991203-9b8d-4985-8e90-b3d1772f6b8f/mariadb-account-create-update/0.log" Mar 20 13:55:25 crc kubenswrapper[4755]: I0320 13:55:25.931460 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_8318edf5-5648-4c19-8853-3d555435ed6f/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.041012 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80d8fd29-d89f-4955-86f3-a8137400c67b/nova-metadata-log/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.196637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_80d8fd29-d89f-4955-86f3-a8137400c67b/nova-metadata-metadata/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.283009 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_15175c96-bbe4-4a56-be68-a5db33909e54/nova-scheduler-scheduler/0.log" Mar 20 13:55:26 crc 
kubenswrapper[4755]: I0320 13:55:26.357477 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.563913 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.616037 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.625262 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_d1ede4a6-e06a-4084-8ba6-5f1c7f838bbe/galera/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.760144 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/mysql-bootstrap/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.826462 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_23ab8e52-0cde-43ec-af8d-24f794695200/galera/0.log" Mar 20 13:55:26 crc kubenswrapper[4755]: I0320 13:55:26.866824 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_96136572-ead6-4771-bd36-eec29b5fb137/openstackclient/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.028490 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-kbcdp_408d869f-0966-4908-88e5-37cdff345c4a/ovn-controller/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.106198 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nmwms_3a0e99e7-7429-41a7-bff7-23cafba6b78a/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 
13:55:27.229353 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server-init/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.461761 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovs-vswitchd/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.501386 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server-init/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.538909 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-wbxnd_b2961ad5-0d2c-46e9-bb50-2e2893353945/ovsdb-server/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.679631 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1bdd912-fe33-4449-aed8-12a5ee09961e/ovn-northd/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.719560 4755 scope.go:117] "RemoveContainer" containerID="d32a03fa6c5ec614c940e4786d6b24b7cd59ebe20410aefe66d29da51483eac7" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.722637 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_d1bdd912-fe33-4449-aed8-12a5ee09961e/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.743972 4755 scope.go:117] "RemoveContainer" containerID="cc38e9370c808ee69a7f50b592873b1cbd16fcfb71225b312f2d6cb70c4fe9fd" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.795865 4755 scope.go:117] "RemoveContainer" containerID="bf3c4c3fe9431051d31c8d3be691fe02ec3059d025e2ec130cb4e7e269504bb9" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.898608 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovsdbserver-nb-0_de877bb8-b1cd-45de-94c1-5242659fd03e/openstack-network-exporter/0.log" Mar 20 13:55:27 crc kubenswrapper[4755]: I0320 13:55:27.910453 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_de877bb8-b1cd-45de-94c1-5242659fd03e/ovsdbserver-nb/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.113380 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fed1ecda-4acb-4a4c-a84e-12e58b3ad243/openstack-network-exporter/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.118569 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_fed1ecda-4acb-4a4c-a84e-12e58b3ad243/ovsdbserver-sb/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.259327 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65884d74bb-n9mkw_0187d784-0bbe-4f5f-9b84-ee240bb90970/placement-api/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.316390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-65884d74bb-n9mkw_0187d784-0bbe-4f5f-9b84-ee240bb90970/placement-log/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.384695 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-9157-account-create-update-q8r48_0587eb58-cd5e-4e0b-be30-97e0a569fc57/mariadb-account-create-update/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.516946 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-ng8vm_2af42784-d5cc-4f7c-832a-f91dbd54cc3f/mariadb-database-create/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.588129 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-sync-cxr9p_7ea35a84-68ca-4490-b1d9-fa999ef63ebe/placement-db-sync/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.712999 4755 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/setup-container/0.log" Mar 20 13:55:28 crc kubenswrapper[4755]: I0320 13:55:28.898648 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.006644 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.065246 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_6d21386c-8267-4dba-9028-d5cb729ff78b/rabbitmq/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.189331 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/setup-container/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.218886 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_c2ca344f-8f18-4dd9-9e5c-44669ff2da4f/rabbitmq/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.271214 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-jvtvk_8ae45e95-b96a-4157-a584-a6eb321d5091/mariadb-account-create-update/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.444386 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847679bbfc-l8kwj_12a81787-83e5-4552-85e6-19733309756d/proxy-server/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.539735 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-847679bbfc-l8kwj_12a81787-83e5-4552-85e6-19733309756d/proxy-httpd/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.576453 4755 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-j55xs_c13f5042-e5e5-47a3-bc96-b504a0bf9af2/swift-ring-rebalance/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.734698 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-auditor/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.773982 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-reaper/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.779136 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-replicator/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.910129 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/account-server/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.961035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-auditor/0.log" Mar 20 13:55:29 crc kubenswrapper[4755]: I0320 13:55:29.992950 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-replicator/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.036298 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-server/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.119583 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/container-updater/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.138856 4755 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:30 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:30 crc kubenswrapper[4755]: > Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.223138 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-expirer/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.226399 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-auditor/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.269449 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-replicator/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.315124 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-server/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.405056 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/object-updater/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.439757 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/rsync/0.log" Mar 20 13:55:30 crc kubenswrapper[4755]: I0320 13:55:30.487679 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_70300053-7713-4d2c-8e59-a123e9f0f189/swift-recon-cron/0.log" Mar 20 13:55:31 crc kubenswrapper[4755]: I0320 13:55:31.582957 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_1786d302-95f2-410e-8280-14a89cbaf48c/memcached/0.log" Mar 20 13:55:36 crc 
kubenswrapper[4755]: I0320 13:55:36.750977 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:55:36 crc kubenswrapper[4755]: I0320 13:55:36.751630 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:55:40 crc kubenswrapper[4755]: I0320 13:55:40.140195 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:40 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:40 crc kubenswrapper[4755]: > Mar 20 13:55:50 crc kubenswrapper[4755]: I0320 13:55:50.146312 4755 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" probeResult="failure" output=< Mar 20 13:55:50 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:55:50 crc kubenswrapper[4755]: > Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.376594 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.594971 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.647192 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.659783 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.836097 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/pull/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.846205 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/util/0.log" Mar 20 13:55:55 crc kubenswrapper[4755]: I0320 13:55:55.899119 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_5bb4269e2ca7c2864b38da7b430e3ab71ac70979e52ed2ca3c50a63be22r55v_74e2672d-2bea-46ce-961b-58decbe4a9c4/extract/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.096602 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-ds4tb_3a22a8d8-92cd-4177-a597-9c659673392c/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.434246 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-xd8mk_00fc80a4-4ea8-4f61-8795-6473f0adc40a/manager/0.log" Mar 20 13:55:56 crc 
kubenswrapper[4755]: I0320 13:55:56.551514 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-cbj27_bc80030a-428b-4643-9d8d-2b0e9c873060/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.636067 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-92fwj_552e0390-e86e-4972-bf6f-a4570e6b6f81/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.831201 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-hxmnd_52210224-8989-4e16-8fdf-4ea3a8211b10/manager/0.log" Mar 20 13:55:56 crc kubenswrapper[4755]: I0320 13:55:56.902809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-pr9d5_4c1ba89a-aed6-4245-8411-4d1fecac2500/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.162414 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-f2nbs_21c9358d-2c84-4c38-9c91-8ca3dad4dab7/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.170331 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b55fff5bb-sm4wg_83d6120d-b54b-452c-aa8a-026665f1afae/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.282254 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-4x5nd_5a83ca27-3334-4aac-9129-5635d3af0714/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.365839 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gszjd_7f51051e-6a90-4582-a411-28a106c37118/manager/0.log" Mar 20 
13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.502362 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-2d9hb_fe4ddc70-f382-4b32-8879-122023b45438/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.664168 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-c8crg_1aaef0d5-16fe-4c61-82d5-660f29168171/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.789316 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-r8jn7_358d4809-db3b-4468-8c8c-4ffbedc0ec89/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.861584 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-c5nbs_b3c037b9-79d2-45ea-9b92-66e50eb20e6b/manager/0.log" Mar 20 13:55:57 crc kubenswrapper[4755]: I0320 13:55:57.985790 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-86657c54f58nx7v_bad91c65-94da-4f8a-addb-21b037197217/manager/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.403809 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6c645d7445-cbmxt_e837c2d9-26ab-47a1-b48a-44f28fc2e2a6/operator/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.691477 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-98hpf_b58ff15e-f098-460d-ada4-3bdd990125ba/registry-server/0.log" Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.804342 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-2xgt8_72ea5d65-7221-4b25-9025-7a5c31bae331/manager/0.log" 
Mar 20 13:55:58 crc kubenswrapper[4755]: I0320 13:55:58.965687 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-j7qf2_a1b32bae-fa65-45aa-a8db-b46a7351ee2c/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.028287 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-795d5ff795-ld7m6_42c9c167-c386-4d60-868c-8b0b63fccbcd/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.089824 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-bfh6x_14b4b9ba-026c-4fd7-a57d-545e62b6981e/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.150084 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.201209 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.280948 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-hpmzq_bc93761d-ecc1-4179-8287-40fd76ba5ad1/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.343679 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-4khh5_26dfba7a-f5fa-45bc-a187-91ddce4da2d6/manager/0.log" Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.386424 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:55:59 crc kubenswrapper[4755]: I0320 13:55:59.511944 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-z9px7_88862bd4-c890-447c-b4ee-b9cb1a4928e8/manager/0.log" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.140577 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.141771 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146084 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146247 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.146321 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.153429 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.295965 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.397390 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: 
\"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.419364 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"auto-csr-approver-29566916-jf8sk\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.507879 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.861406 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-k6d2r" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" containerID="cri-o://e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" gracePeriod=2 Mar 20 13:56:00 crc kubenswrapper[4755]: I0320 13:56:00.986991 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.012696 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.270310 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315630 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315785 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.315878 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") pod \"2363273e-8f78-4383-a21b-23f0d8a234b4\" (UID: \"2363273e-8f78-4383-a21b-23f0d8a234b4\") " Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.318762 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities" (OuterVolumeSpecName: "utilities") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.333260 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq" (OuterVolumeSpecName: "kube-api-access-rw4gq") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "kube-api-access-rw4gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.417871 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.417906 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4gq\" (UniqueName: \"kubernetes.io/projected/2363273e-8f78-4383-a21b-23f0d8a234b4-kube-api-access-rw4gq\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.448306 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2363273e-8f78-4383-a21b-23f0d8a234b4" (UID: "2363273e-8f78-4383-a21b-23f0d8a234b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.519749 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2363273e-8f78-4383-a21b-23f0d8a234b4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.870572 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerStarted","Data":"12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873784 4755 generic.go:334] "Generic (PLEG): container finished" podID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" exitCode=0 Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873845 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873877 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k6d2r" event={"ID":"2363273e-8f78-4383-a21b-23f0d8a234b4","Type":"ContainerDied","Data":"52b02238107f8ca4e75a6ad385761dee4f2cd32405f08f6c4064824ba74ac65f"} Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.873898 4755 scope.go:117] "RemoveContainer" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.874038 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k6d2r" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.908837 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.931146 4755 scope.go:117] "RemoveContainer" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.931532 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k6d2r"] Mar 20 13:56:01 crc kubenswrapper[4755]: I0320 13:56:01.971545 4755 scope.go:117] "RemoveContainer" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.004837 4755 scope.go:117] "RemoveContainer" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.006641 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": container with ID starting with e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a not found: ID does not exist" containerID="e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.006696 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a"} err="failed to get container status \"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": rpc error: code = NotFound desc = could not find container \"e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a\": container with ID starting with e987020b3328738e597a4b0d94ef6b3a32a426609445f37ae1200514baf5125a not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.006722 4755 scope.go:117] "RemoveContainer" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.008136 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": container with ID starting with 3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00 not found: ID does not exist" containerID="3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008188 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00"} err="failed to get container status \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": rpc error: code = NotFound desc = could not find container \"3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00\": container with ID 
starting with 3a0041e0a02de94bad27e2d486cce8a6eac322aacdfcc4d83adad3421f03ff00 not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008218 4755 scope.go:117] "RemoveContainer" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: E0320 13:56:02.008548 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": container with ID starting with 9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10 not found: ID does not exist" containerID="9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.008589 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10"} err="failed to get container status \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": rpc error: code = NotFound desc = could not find container \"9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10\": container with ID starting with 9f6a7995375f261e3c248ebc7995e0ca4e9d5270038872f1a6b83380aad0aa10 not found: ID does not exist" Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.885074 4755 generic.go:334] "Generic (PLEG): container finished" podID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerID="ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8" exitCode=0 Mar 20 13:56:02 crc kubenswrapper[4755]: I0320 13:56:02.885164 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerDied","Data":"ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8"} Mar 20 13:56:03 crc kubenswrapper[4755]: I0320 13:56:03.235752 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" path="/var/lib/kubelet/pods/2363273e-8f78-4383-a21b-23f0d8a234b4/volumes" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.267127 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.374932 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") pod \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\" (UID: \"c74d4c86-05c3-4ac3-a18e-cb75b4d95559\") " Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.383015 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2" (OuterVolumeSpecName: "kube-api-access-6mqs2") pod "c74d4c86-05c3-4ac3-a18e-cb75b4d95559" (UID: "c74d4c86-05c3-4ac3-a18e-cb75b4d95559"). InnerVolumeSpecName "kube-api-access-6mqs2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.478686 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqs2\" (UniqueName: \"kubernetes.io/projected/c74d4c86-05c3-4ac3-a18e-cb75b4d95559-kube-api-access-6mqs2\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919422 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" event={"ID":"c74d4c86-05c3-4ac3-a18e-cb75b4d95559","Type":"ContainerDied","Data":"12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0"} Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919832 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12e4c310d804ddc74263622c07486fe4db07e801bcdc1efdc2be905d37eaf6f0" Mar 20 13:56:04 crc kubenswrapper[4755]: I0320 13:56:04.919500 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-jf8sk" Mar 20 13:56:05 crc kubenswrapper[4755]: I0320 13:56:05.336382 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:56:05 crc kubenswrapper[4755]: I0320 13:56:05.344684 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-g8cp9"] Mar 20 13:56:06 crc kubenswrapper[4755]: I0320 13:56:06.751739 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:06 crc kubenswrapper[4755]: I0320 13:56:06.752144 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" 
podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:07 crc kubenswrapper[4755]: I0320 13:56:07.237591 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74e82d0-07c7-4a72-baa4-9ec1e8427b5f" path="/var/lib/kubelet/pods/f74e82d0-07c7-4a72-baa4-9ec1e8427b5f/volumes" Mar 20 13:56:20 crc kubenswrapper[4755]: I0320 13:56:20.911549 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-2bm86_85fb2982-9af0-4450-80f4-12fbd6e7a590/control-plane-machine-set-operator/0.log" Mar 20 13:56:21 crc kubenswrapper[4755]: I0320 13:56:21.103818 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4zdx6_1fdd6691-9136-43ba-abea-7ba6862e9681/kube-rbac-proxy/0.log" Mar 20 13:56:21 crc kubenswrapper[4755]: I0320 13:56:21.172781 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4zdx6_1fdd6691-9136-43ba-abea-7ba6862e9681/machine-api-operator/0.log" Mar 20 13:56:27 crc kubenswrapper[4755]: I0320 13:56:27.982438 4755 scope.go:117] "RemoveContainer" containerID="5e291591338cd28d27f7c79f1207aa9e8798379d161e84714c136ccdd26f3418" Mar 20 13:56:28 crc kubenswrapper[4755]: I0320 13:56:28.039149 4755 scope.go:117] "RemoveContainer" containerID="bc5c594bc79ce85ad85bbd3d37f64dfa62a65d2829adf1689e00f118e765dbae" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.351907 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-7gpgn_cdf5c938-39f0-46a4-bce6-1a0cf67624ab/cert-manager-controller/0.log" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.423197 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-l955j_a3125fba-bed9-40d3-b53d-f976488e12d2/cert-manager-cainjector/0.log" Mar 20 13:56:35 crc kubenswrapper[4755]: I0320 13:56:35.535575 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-hbz2p_f3b802e1-c690-4817-91cf-d721cbfae51c/cert-manager-webhook/0.log" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.637524 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.637996 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638014 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638056 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638064 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638082 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-content" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638091 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-content" Mar 20 13:56:36 crc kubenswrapper[4755]: E0320 13:56:36.638103 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-utilities" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638110 4755 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="extract-utilities" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638337 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" containerName="oc" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.638357 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="2363273e-8f78-4383-a21b-23f0d8a234b4" containerName="registry-server" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.640064 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.662423 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725239 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725642 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.725784 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod 
\"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751313 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751366 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751402 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.751999 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.752054 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" gracePeriod=600 Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828086 4755 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828151 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828204 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828680 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.828766 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.851279 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"certified-operators-m5fpr\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:36 crc kubenswrapper[4755]: I0320 13:56:36.957918 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211357 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" exitCode=0 Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211740 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a"} Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211765 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerStarted","Data":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.211781 4755 scope.go:117] "RemoveContainer" containerID="0d23ac7a061484f91b8a9a00afe1aaab054547a5c7ed091e4329b2edf9a01e74" Mar 20 13:56:37 crc kubenswrapper[4755]: I0320 13:56:37.541833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:37 crc kubenswrapper[4755]: W0320 13:56:37.546071 4755 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda94c1d23_46c0_439e_8d37_f3bbfeed4646.slice/crio-8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041 WatchSource:0}: Error finding container 8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041: Status 404 returned error can't find the container with id 8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041 Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.227928 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" exitCode=0 Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.227982 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50"} Mar 20 13:56:38 crc kubenswrapper[4755]: I0320 13:56:38.228361 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041"} Mar 20 13:56:39 crc kubenswrapper[4755]: I0320 13:56:39.242443 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} Mar 20 13:56:40 crc kubenswrapper[4755]: I0320 13:56:40.254838 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" exitCode=0 Mar 20 13:56:40 crc kubenswrapper[4755]: I0320 13:56:40.254887 4755 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} Mar 20 13:56:41 crc kubenswrapper[4755]: I0320 13:56:41.267182 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerStarted","Data":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} Mar 20 13:56:41 crc kubenswrapper[4755]: I0320 13:56:41.299349 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5fpr" podStartSLOduration=2.822440043 podStartE2EDuration="5.299325827s" podCreationTimestamp="2026-03-20 13:56:36 +0000 UTC" firstStartedPulling="2026-03-20 13:56:38.230534338 +0000 UTC m=+1577.828466907" lastFinishedPulling="2026-03-20 13:56:40.707420162 +0000 UTC m=+1580.305352691" observedRunningTime="2026-03-20 13:56:41.28810232 +0000 UTC m=+1580.886034849" watchObservedRunningTime="2026-03-20 13:56:41.299325827 +0000 UTC m=+1580.897258366" Mar 20 13:56:46 crc kubenswrapper[4755]: I0320 13:56:46.958724 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:46 crc kubenswrapper[4755]: I0320 13:56:46.959476 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.015156 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.409531 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:47 crc kubenswrapper[4755]: I0320 13:56:47.474065 
4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.351519 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-2hkvk_a9993046-1fc7-4faa-a634-f91339d94c71/nmstate-console-plugin/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.353833 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m5fpr" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" containerID="cri-o://706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" gracePeriod=2 Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.530648 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-dspfd_2e4b8ce9-115c-4c39-9f1b-a5681ded9b68/nmstate-handler/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.589034 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-68g6g_284e4beb-7815-41fc-ac59-95ed647c0d7c/kube-rbac-proxy/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.664148 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-68g6g_284e4beb-7815-41fc-ac59-95ed647c0d7c/nmstate-metrics/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.761366 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-rz567_93adf7be-d696-48e2-b6d5-af27b19b24e3/nmstate-operator/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.832412 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.863390 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-72787_36f8cd57-a5ee-4a30-b7b6-8f13d698861c/nmstate-webhook/0.log" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882594 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882640 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.882718 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") pod \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\" (UID: \"a94c1d23-46c0-439e-8d37-f3bbfeed4646\") " Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.883733 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities" (OuterVolumeSpecName: "utilities") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.889883 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9" (OuterVolumeSpecName: "kube-api-access-klpq9") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "kube-api-access-klpq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.985471 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:49 crc kubenswrapper[4755]: I0320 13:56:49.985509 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpq9\" (UniqueName: \"kubernetes.io/projected/a94c1d23-46c0-439e-8d37-f3bbfeed4646-kube-api-access-klpq9\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364115 4755 generic.go:334] "Generic (PLEG): container finished" podID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" exitCode=0 Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364160 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364180 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5fpr" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364198 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5fpr" event={"ID":"a94c1d23-46c0-439e-8d37-f3bbfeed4646","Type":"ContainerDied","Data":"8a64e3af1618b1bd93879057a4895927e080444935d1460c499da26a6ed14041"} Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.364222 4755 scope.go:117] "RemoveContainer" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.396299 4755 scope.go:117] "RemoveContainer" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.422802 4755 scope.go:117] "RemoveContainer" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.467989 4755 scope.go:117] "RemoveContainer" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.468397 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": container with ID starting with 706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52 not found: ID does not exist" containerID="706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468450 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52"} err="failed to get container status \"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": rpc error: code = NotFound desc = could not find container 
\"706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52\": container with ID starting with 706104a477d145f847883bc3e5a497d2243127f72bff713f5a5c423496b17b52 not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468495 4755 scope.go:117] "RemoveContainer" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.468868 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": container with ID starting with a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b not found: ID does not exist" containerID="a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468903 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b"} err="failed to get container status \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": rpc error: code = NotFound desc = could not find container \"a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b\": container with ID starting with a4f2d85737bffe5fe8e08b82ec33264eae42e2dafc647a2e2ded635daf70471b not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.468926 4755 scope.go:117] "RemoveContainer" containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: E0320 13:56:50.469334 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": container with ID starting with c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50 not found: ID does not exist" 
containerID="c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.469387 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50"} err="failed to get container status \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": rpc error: code = NotFound desc = could not find container \"c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50\": container with ID starting with c40b1fb2c5c6144cfa3cc8e63015875f64e3c257230c4da0229d4162086d7b50 not found: ID does not exist" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.532331 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a94c1d23-46c0-439e-8d37-f3bbfeed4646" (UID: "a94c1d23-46c0-439e-8d37-f3bbfeed4646"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.598730 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a94c1d23-46c0-439e-8d37-f3bbfeed4646-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.697167 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:50 crc kubenswrapper[4755]: I0320 13:56:50.710165 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m5fpr"] Mar 20 13:56:51 crc kubenswrapper[4755]: I0320 13:56:51.241939 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" path="/var/lib/kubelet/pods/a94c1d23-46c0-439e-8d37-f3bbfeed4646/volumes" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.844773 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845721 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-utilities" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845734 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-utilities" Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845757 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845763 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: E0320 13:57:06.845794 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-content" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845800 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="extract-content" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.845959 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a94c1d23-46c0-439e-8d37-f3bbfeed4646" containerName="registry-server" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.847182 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.866689 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895437 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895591 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.895632 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" 
(UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.997842 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998026 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998080 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998501 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:06 crc kubenswrapper[4755]: I0320 13:57:06.998601 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " 
pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.017017 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"redhat-marketplace-4wl28\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.175119 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:07 crc kubenswrapper[4755]: I0320 13:57:07.663463 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:08 crc kubenswrapper[4755]: E0320 13:57:08.090014 4755 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a373048_a6fb_43f3_86bf_cc41057c8ecd.slice/crio-conmon-5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a373048_a6fb_43f3_86bf_cc41057c8ecd.slice/crio-5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539756 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" exitCode=0 Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539802 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" 
event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40"} Mar 20 13:57:08 crc kubenswrapper[4755]: I0320 13:57:08.539828 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"30a8bd4f907483e441f264aeaf06d49007454d93dd8d2113c45351adc85d9f47"} Mar 20 13:57:09 crc kubenswrapper[4755]: I0320 13:57:09.550856 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} Mar 20 13:57:10 crc kubenswrapper[4755]: I0320 13:57:10.562898 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" exitCode=0 Mar 20 13:57:10 crc kubenswrapper[4755]: I0320 13:57:10.563817 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} Mar 20 13:57:11 crc kubenswrapper[4755]: I0320 13:57:11.575354 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerStarted","Data":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} Mar 20 13:57:11 crc kubenswrapper[4755]: I0320 13:57:11.595826 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4wl28" podStartSLOduration=3.146818998 podStartE2EDuration="5.595808274s" podCreationTimestamp="2026-03-20 13:57:06 +0000 
UTC" firstStartedPulling="2026-03-20 13:57:08.541491128 +0000 UTC m=+1608.139423667" lastFinishedPulling="2026-03-20 13:57:10.990480414 +0000 UTC m=+1610.588412943" observedRunningTime="2026-03-20 13:57:11.590453263 +0000 UTC m=+1611.188385802" watchObservedRunningTime="2026-03-20 13:57:11.595808274 +0000 UTC m=+1611.193740803" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.175877 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.176622 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.239753 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.701682 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:17 crc kubenswrapper[4755]: I0320 13:57:17.748907 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.277535 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qsbbn_a71f1548-62b5-4a77-9655-735bafa396c8/kube-rbac-proxy/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.401309 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-qsbbn_a71f1548-62b5-4a77-9655-735bafa396c8/controller/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.464744 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.689594 4755 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.689633 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.706641 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.707861 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.875175 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.893465 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.909001 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:18 crc kubenswrapper[4755]: I0320 13:57:18.913599 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.044295 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-frr-files/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.061821 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-reloader/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.068956 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/cp-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.093762 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/controller/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.228471 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/frr-metrics/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.284035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/kube-rbac-proxy/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.310324 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/kube-rbac-proxy-frr/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.437445 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/reloader/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.526878 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-7xgrp_490ee5e7-c0b1-4181-b7ac-86e5e61253a0/frr-k8s-webhook-server/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.671360 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4wl28" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" containerID="cri-o://85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" 
gracePeriod=2 Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.767392 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6ddbc48b88-k4d8p_32289872-a679-4d10-8b2f-0519c713dc35/manager/0.log" Mar 20 13:57:19 crc kubenswrapper[4755]: I0320 13:57:19.938213 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-588c694cdc-8vjlb_f0274fca-6425-402c-a2aa-853b232ad93c/webhook-server/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.016015 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6vf4n_839a8db3-662c-41c4-bb63-6b1027901ab5/kube-rbac-proxy/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.097160 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-5l5hs_1152c78e-15f9-4826-acc3-3d7f5765db68/frr/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.284026 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.475896 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.476170 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.476203 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") pod \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\" (UID: \"5a373048-a6fb-43f3-86bf-cc41057c8ecd\") " Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.477739 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities" (OuterVolumeSpecName: "utilities") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.484150 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4" (OuterVolumeSpecName: "kube-api-access-4w7h4") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "kube-api-access-4w7h4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.491163 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-6vf4n_839a8db3-662c-41c4-bb63-6b1027901ab5/speaker/0.log" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.506244 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a373048-a6fb-43f3-86bf-cc41057c8ecd" (UID: "5a373048-a6fb-43f3-86bf-cc41057c8ecd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578686 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w7h4\" (UniqueName: \"kubernetes.io/projected/5a373048-a6fb-43f3-86bf-cc41057c8ecd-kube-api-access-4w7h4\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578724 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.578733 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a373048-a6fb-43f3-86bf-cc41057c8ecd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682033 4755 generic.go:334] "Generic (PLEG): container finished" podID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" exitCode=0 Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682072 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" 
event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682286 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wl28" event={"ID":"5a373048-a6fb-43f3-86bf-cc41057c8ecd","Type":"ContainerDied","Data":"30a8bd4f907483e441f264aeaf06d49007454d93dd8d2113c45351adc85d9f47"} Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682307 4755 scope.go:117] "RemoveContainer" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.682307 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wl28" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.702221 4755 scope.go:117] "RemoveContainer" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.713162 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.722108 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wl28"] Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.734897 4755 scope.go:117] "RemoveContainer" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768400 4755 scope.go:117] "RemoveContainer" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.768865 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": container 
with ID starting with 85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d not found: ID does not exist" containerID="85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768909 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d"} err="failed to get container status \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": rpc error: code = NotFound desc = could not find container \"85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d\": container with ID starting with 85433c71b1540d48d3303ae570b22f1e08e90d9647a56f16767bc04ee81ff80d not found: ID does not exist" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.768938 4755 scope.go:117] "RemoveContainer" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.769234 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": container with ID starting with 212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a not found: ID does not exist" containerID="212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769265 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a"} err="failed to get container status \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": rpc error: code = NotFound desc = could not find container \"212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a\": container with ID starting with 212317afb0e07b989ebb6ddb0e3a73999610e0a62082289d795a82c50c8fc94a not 
found: ID does not exist" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769287 4755 scope.go:117] "RemoveContainer" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: E0320 13:57:20.769708 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": container with ID starting with 5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40 not found: ID does not exist" containerID="5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40" Mar 20 13:57:20 crc kubenswrapper[4755]: I0320 13:57:20.769734 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40"} err="failed to get container status \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": rpc error: code = NotFound desc = could not find container \"5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40\": container with ID starting with 5c9e25893b88d1a3353e5752a99ff0a67bf5ad2e768e7a5cd7de500ff3b88e40 not found: ID does not exist" Mar 20 13:57:21 crc kubenswrapper[4755]: I0320 13:57:21.237945 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" path="/var/lib/kubelet/pods/5a373048-a6fb-43f3-86bf-cc41057c8ecd/volumes" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.678045 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.811285 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.909988 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:33 crc kubenswrapper[4755]: I0320 13:57:33.919566 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.049898 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.051186 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.133070 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874xrmsv_73d92c92-af26-4aa9-a774-04a1ef37b3c7/extract/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.236162 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.402064 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 
13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.421153 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.422087 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.596450 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/util/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.602526 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/extract/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.605086 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1nmvgl_0e8346da-4c59-4f8f-9804-02ad176bc15d/pull/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.765435 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.925474 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.937339 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:34 crc kubenswrapper[4755]: I0320 13:57:34.948057 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.132071 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.151273 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.384502 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.459434 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.470276 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-nql9k_2b421640-e220-4567-8600-8e0ba78a981a/registry-server/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.480638 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.569359 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.728170 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-content/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.731888 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/extract-utilities/0.log" Mar 20 13:57:35 crc kubenswrapper[4755]: I0320 13:57:35.956585 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-ngw4b_6d1fc18c-b364-439b-926f-12fe310d0917/marketplace-operator/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.023540 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.063463 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6g8x4_504e1957-f41e-4927-927f-d5ac7e8eb625/registry-server/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.161442 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.204953 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.211616 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.412274 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.430035 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.479277 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-srzwn_1107b669-3bdf-4189-a37a-b79ddb758fff/registry-server/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.609118 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.790596 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.792375 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.816843 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:36 crc kubenswrapper[4755]: I0320 13:57:36.953636 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-utilities/0.log" 
Mar 20 13:57:37 crc kubenswrapper[4755]: I0320 13:57:37.047770 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/extract-content/0.log" Mar 20 13:57:37 crc kubenswrapper[4755]: I0320 13:57:37.214047 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-c9s6q_f483e049-5032-496f-8608-494e07922763/registry-server/0.log" Mar 20 13:57:53 crc kubenswrapper[4755]: E0320 13:57:53.495332 4755 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.181:39742->38.102.83.181:38787: write tcp 38.102.83.181:39742->38.102.83.181:38787: write: broken pipe Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.174156 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175493 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175506 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175531 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175537 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: E0320 13:58:00.175547 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-content" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175553 4755 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="extract-content" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.175844 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a373048-a6fb-43f3-86bf-cc41057c8ecd" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.176533 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.179560 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.179834 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.180006 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.192833 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.208866 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.311928 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " 
pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.333423 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"auto-csr-approver-29566918-krhnp\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:00 crc kubenswrapper[4755]: I0320 13:58:00.507775 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:01 crc kubenswrapper[4755]: I0320 13:58:01.026030 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-krhnp"] Mar 20 13:58:01 crc kubenswrapper[4755]: I0320 13:58:01.088256 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerStarted","Data":"600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7"} Mar 20 13:58:03 crc kubenswrapper[4755]: I0320 13:58:03.118297 4755 generic.go:334] "Generic (PLEG): container finished" podID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerID="6b93c4f221413ce21758a54cb81f9dd1307ec1714e9b3709e5e630c41008370d" exitCode=0 Mar 20 13:58:03 crc kubenswrapper[4755]: I0320 13:58:03.118424 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerDied","Data":"6b93c4f221413ce21758a54cb81f9dd1307ec1714e9b3709e5e630c41008370d"} Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.555134 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.599901 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") pod \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\" (UID: \"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6\") " Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.609710 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj" (OuterVolumeSpecName: "kube-api-access-bhqhj") pod "1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" (UID: "1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6"). InnerVolumeSpecName "kube-api-access-bhqhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:04 crc kubenswrapper[4755]: I0320 13:58:04.701707 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhqhj\" (UniqueName: \"kubernetes.io/projected/1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6-kube-api-access-bhqhj\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.136975 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-krhnp" event={"ID":"1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6","Type":"ContainerDied","Data":"600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7"} Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.137402 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="600d812ad812154760a23e2a9ec4cfe3f80ac8d745075a008606eb069cbe49f7" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.137464 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-krhnp" Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.632688 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:58:05 crc kubenswrapper[4755]: I0320 13:58:05.645731 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-cl4dk"] Mar 20 13:58:07 crc kubenswrapper[4755]: I0320 13:58:07.235700 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cf774b-eb80-4f5b-a7de-2012636d36c5" path="/var/lib/kubelet/pods/78cf774b-eb80-4f5b-a7de-2012636d36c5/volumes" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.190582 4755 scope.go:117] "RemoveContainer" containerID="72b2f009d2a4423710b2308fccd453e64decc2036c9ffeba13690d2169eaf608" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.250917 4755 scope.go:117] "RemoveContainer" containerID="b71ae13336387ea23999ea11909327df7acce7d15838a4a3cc47714c51e01dd7" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.299431 4755 scope.go:117] "RemoveContainer" containerID="e540a31f1146615f2ed1e93f0d4499bf812a64fdbca502fc17c0ae21a3e0859b" Mar 20 13:58:28 crc kubenswrapper[4755]: I0320 13:58:28.329416 4755 scope.go:117] "RemoveContainer" containerID="56e5da3a4d3cca130e732f0fa36ea2a7898e408a1dbb25a62bc54c120646cff0" Mar 20 13:58:41 crc kubenswrapper[4755]: I0320 13:58:41.630839 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-marketplace-srzwn" podUID="1107b669-3bdf-4189-a37a-b79ddb758fff" containerName="registry-server" probeResult="failure" output=< Mar 20 13:58:41 crc kubenswrapper[4755]: timeout: failed to connect service ":50051" within 1s Mar 20 13:58:41 crc kubenswrapper[4755]: > Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.067306 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:59:00 crc 
kubenswrapper[4755]: I0320 13:59:00.094161 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.105198 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-g2hvs"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.116897 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-8e2f-account-create-update-cvvh2"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.126371 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.133196 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.147775 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-ng8vm"] Mar 20 13:59:00 crc kubenswrapper[4755]: I0320 13:59:00.159092 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-9157-account-create-update-q8r48"] Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.268073 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0587eb58-cd5e-4e0b-be30-97e0a569fc57" path="/var/lib/kubelet/pods/0587eb58-cd5e-4e0b-be30-97e0a569fc57/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.270179 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0795b626-b382-4b9b-beb5-802cebc4f764" path="/var/lib/kubelet/pods/0795b626-b382-4b9b-beb5-802cebc4f764/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.271965 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2af42784-d5cc-4f7c-832a-f91dbd54cc3f" path="/var/lib/kubelet/pods/2af42784-d5cc-4f7c-832a-f91dbd54cc3f/volumes" Mar 20 13:59:01 crc kubenswrapper[4755]: I0320 13:59:01.273889 4755 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79c00857-0d6a-4c12-8581-da16e2a24f04" path="/var/lib/kubelet/pods/79c00857-0d6a-4c12-8581-da16e2a24f04/volumes" Mar 20 13:59:06 crc kubenswrapper[4755]: I0320 13:59:06.751162 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:06 crc kubenswrapper[4755]: I0320 13:59:06.752296 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.034760 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.043358 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-35fe-account-create-update-h6fl8"] Mar 20 13:59:07 crc kubenswrapper[4755]: I0320 13:59:07.237023 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46d041c2-e231-49fd-9d88-a991a1b9dd65" path="/var/lib/kubelet/pods/46d041c2-e231-49fd-9d88-a991a1b9dd65/volumes" Mar 20 13:59:08 crc kubenswrapper[4755]: I0320 13:59:08.033163 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:59:08 crc kubenswrapper[4755]: I0320 13:59:08.041344 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-pg2bq"] Mar 20 13:59:09 crc kubenswrapper[4755]: I0320 13:59:09.242257 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6fe77db3-29ef-42ae-840b-9736f07188ca" path="/var/lib/kubelet/pods/6fe77db3-29ef-42ae-840b-9736f07188ca/volumes" Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.919055 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" exitCode=0 Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.919110 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" event={"ID":"9e7e4d4d-749a-4ec8-89f4-1362f7787e43","Type":"ContainerDied","Data":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} Mar 20 13:59:15 crc kubenswrapper[4755]: I0320 13:59:15.920322 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:16 crc kubenswrapper[4755]: I0320 13:59:16.959443 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/gather/0.log" Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.980097 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.980975 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" containerID="cri-o://5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" gracePeriod=2 Mar 20 13:59:24 crc kubenswrapper[4755]: I0320 13:59:24.992694 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6z9hz/must-gather-2mgxd"] Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.487560 4755 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/copy/0.log" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.488555 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.654497 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") pod \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.654705 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") pod \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\" (UID: \"9e7e4d4d-749a-4ec8-89f4-1362f7787e43\") " Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.666488 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh" (OuterVolumeSpecName: "kube-api-access-8wmwh") pod "9e7e4d4d-749a-4ec8-89f4-1362f7787e43" (UID: "9e7e4d4d-749a-4ec8-89f4-1362f7787e43"). InnerVolumeSpecName "kube-api-access-8wmwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.756861 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wmwh\" (UniqueName: \"kubernetes.io/projected/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-kube-api-access-8wmwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.798921 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9e7e4d4d-749a-4ec8-89f4-1362f7787e43" (UID: "9e7e4d4d-749a-4ec8-89f4-1362f7787e43"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:59:25 crc kubenswrapper[4755]: I0320 13:59:25.862454 4755 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9e7e4d4d-749a-4ec8-89f4-1362f7787e43-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023470 4755 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6z9hz_must-gather-2mgxd_9e7e4d4d-749a-4ec8-89f4-1362f7787e43/copy/0.log" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023890 4755 generic.go:334] "Generic (PLEG): container finished" podID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" exitCode=143 Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.023942 4755 scope.go:117] "RemoveContainer" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.024096 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6z9hz/must-gather-2mgxd" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.043473 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.092887 4755 scope.go:117] "RemoveContainer" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: E0320 13:59:26.093355 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": container with ID starting with 5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4 not found: ID does not exist" containerID="5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093399 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4"} err="failed to get container status \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": rpc error: code = NotFound desc = could not find container \"5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4\": container with ID starting with 5044e599e3559dba946b3a8f9a4ec106563ee732626103d0a13a74d14bf2e5f4 not found: ID does not exist" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093424 4755 scope.go:117] "RemoveContainer" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: E0320 13:59:26.093871 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": container with ID starting with 
e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde not found: ID does not exist" containerID="e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde" Mar 20 13:59:26 crc kubenswrapper[4755]: I0320 13:59:26.093900 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde"} err="failed to get container status \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": rpc error: code = NotFound desc = could not find container \"e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde\": container with ID starting with e7c7416f2b66a7ba2cb634748a2002212ba74895e3801daedaa219d054eabdde not found: ID does not exist" Mar 20 13:59:27 crc kubenswrapper[4755]: I0320 13:59:27.260441 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" path="/var/lib/kubelet/pods/9e7e4d4d-749a-4ec8-89f4-1362f7787e43/volumes" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.053716 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.064875 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-jvtvk"] Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.459695 4755 scope.go:117] "RemoveContainer" containerID="674d9ffa621b68cb8896394d2c6b14777127beaa89643c84451bf059d06cd1b2" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.502674 4755 scope.go:117] "RemoveContainer" containerID="618346b79e083765f10b1f9711db81434bce17a9061d4c4ad4ee22f20d0cf810" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.558478 4755 scope.go:117] "RemoveContainer" containerID="80421922ee03370d8129c89d049852fafb9668bfbc7740f1e18b91bb761a74fb" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.600213 4755 scope.go:117] "RemoveContainer" 
containerID="0e99c115cd3bb8a5a015878c8ebe0d9d286614ffed9e1d567c117c40a7a290d4" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.646982 4755 scope.go:117] "RemoveContainer" containerID="ff19f0ef0d1d01eed12831ecc96428fd14328a94403e4c3c46e9e68449f748a3" Mar 20 13:59:28 crc kubenswrapper[4755]: I0320 13:59:28.716262 4755 scope.go:117] "RemoveContainer" containerID="9125a1bda9b536f2a5e021d1b2954a97e6d55ac5d5d380145fa4a013b9bba955" Mar 20 13:59:29 crc kubenswrapper[4755]: I0320 13:59:29.245105 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ae45e95-b96a-4157-a584-a6eb321d5091" path="/var/lib/kubelet/pods/8ae45e95-b96a-4157-a584-a6eb321d5091/volumes" Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.031800 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.055574 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.063062 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.102945 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2jwbt"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.116999 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.123886 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-jm9nr"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.130100 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-9528-account-create-update-6xkmx"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.138376 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-hm9qz"] Mar 20 
13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.144699 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.150853 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-1376-account-create-update-jhbhp"] Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.751616 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:36 crc kubenswrapper[4755]: I0320 13:59:36.751710 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.241727 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="015c8ae7-1856-4b0c-b5ce-e2503a2080dc" path="/var/lib/kubelet/pods/015c8ae7-1856-4b0c-b5ce-e2503a2080dc/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.242551 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dde547e-5fce-4868-ba0e-63650ea0c771" path="/var/lib/kubelet/pods/5dde547e-5fce-4868-ba0e-63650ea0c771/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.248477 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c5d05dc-a589-4d2e-9374-0d57202a3cfc" path="/var/lib/kubelet/pods/8c5d05dc-a589-4d2e-9374-0d57202a3cfc/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.249131 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e38d31ac-eae6-4cd1-be04-304215db852a" path="/var/lib/kubelet/pods/e38d31ac-eae6-4cd1-be04-304215db852a/volumes" Mar 20 13:59:37 crc kubenswrapper[4755]: I0320 13:59:37.249777 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feb55e83-711d-4561-8b57-2a231944e1b1" path="/var/lib/kubelet/pods/feb55e83-711d-4561-8b57-2a231944e1b1/volumes" Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.047285 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.064362 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.081833 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-w78rr"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.096230 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-fc04-account-create-update-x9t57"] Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.239647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3047e6fe-5128-4361-bede-e9f0c4e9387c" path="/var/lib/kubelet/pods/3047e6fe-5128-4361-bede-e9f0c4e9387c/volumes" Mar 20 13:59:39 crc kubenswrapper[4755]: I0320 13:59:39.240399 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34c85756-25cf-4302-bd5d-72f2e459f562" path="/var/lib/kubelet/pods/34c85756-25cf-4302-bd5d-72f2e459f562/volumes" Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.038850 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.047398 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-9xrbx"] Mar 20 13:59:45 crc kubenswrapper[4755]: I0320 13:59:45.248401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="64ad8e64-0606-4171-bd2d-ae8212fdff8f" path="/var/lib/kubelet/pods/64ad8e64-0606-4171-bd2d-ae8212fdff8f/volumes" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.166420 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167417 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167462 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167499 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167508 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: E0320 14:00:00.167539 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167548 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167877 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" containerName="gather" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167905 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e9a1a08-ee20-4deb-ac7f-ef66d1e624b6" containerName="oc" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.167924 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e4d4d-749a-4ec8-89f4-1362f7787e43" 
containerName="copy" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.168941 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.171330 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.172246 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.191207 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.268768 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.271395 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.273774 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.274843 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.274882 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.279578 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.319837 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.320761 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.320928 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: 
\"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423145 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423245 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423355 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.423409 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.424934 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.439124 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.453902 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"collect-profiles-29566920-2bfsw\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.511379 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.525724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.561170 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"auto-csr-approver-29566920-b2lfj\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.602166 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:00 crc kubenswrapper[4755]: I0320 14:00:00.993040 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw"] Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.135476 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-b2lfj"] Mar 20 14:00:01 crc kubenswrapper[4755]: W0320 14:00:01.145631 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc07cef7a_c5f6_4f4b_8508_8d499928b255.slice/crio-832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14 WatchSource:0}: Error finding container 832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14: Status 404 returned error can't find the container with id 832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14 Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.419438 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerStarted","Data":"832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14"} Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422486 4755 generic.go:334] "Generic (PLEG): container finished" podID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerID="c6bc10e055ad85ee77ba59e601c3da9aa174ba7f0b28a4725b0f849a16bdfa98" exitCode=0 Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422520 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerDied","Data":"c6bc10e055ad85ee77ba59e601c3da9aa174ba7f0b28a4725b0f849a16bdfa98"} Mar 20 14:00:01 crc kubenswrapper[4755]: I0320 14:00:01.422540 4755 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerStarted","Data":"6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4"} Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.922753 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986408 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986472 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.986531 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") pod \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\" (UID: \"e6e921d7-2303-42ff-ac0e-89b8b15127e4\") " Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.987504 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume" (OuterVolumeSpecName: "config-volume") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.993595 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:02 crc kubenswrapper[4755]: I0320 14:00:02.995864 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq" (OuterVolumeSpecName: "kube-api-access-vqdzq") pod "e6e921d7-2303-42ff-ac0e-89b8b15127e4" (UID: "e6e921d7-2303-42ff-ac0e-89b8b15127e4"). InnerVolumeSpecName "kube-api-access-vqdzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088859 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqdzq\" (UniqueName: \"kubernetes.io/projected/e6e921d7-2303-42ff-ac0e-89b8b15127e4-kube-api-access-vqdzq\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088898 4755 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e6e921d7-2303-42ff-ac0e-89b8b15127e4-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.088910 4755 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6e921d7-2303-42ff-ac0e-89b8b15127e4-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.450546 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" 
event={"ID":"e6e921d7-2303-42ff-ac0e-89b8b15127e4","Type":"ContainerDied","Data":"6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4"} Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.451016 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a99d7803f3d183b8dee018512af84388d5a11492309961badfbea7208dd5ba4" Mar 20 14:00:03 crc kubenswrapper[4755]: I0320 14:00:03.450634 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-2bfsw" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751051 4755 patch_prober.go:28] interesting pod/machine-config-daemon-xmn6s container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751563 4755 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.751635 4755 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" Mar 20 14:00:06 crc kubenswrapper[4755]: I0320 14:00:06.752986 4755 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:00:06 crc 
kubenswrapper[4755]: I0320 14:00:06.753094 4755 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerName="machine-config-daemon" containerID="cri-o://0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" gracePeriod=600 Mar 20 14:00:06 crc kubenswrapper[4755]: E0320 14:00:06.896533 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495573 4755 generic.go:334] "Generic (PLEG): container finished" podID="3eb406f6-1a26-4eea-84ac-e55f5232900c" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" exitCode=0 Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495672 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" event={"ID":"3eb406f6-1a26-4eea-84ac-e55f5232900c","Type":"ContainerDied","Data":"0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154"} Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.495760 4755 scope.go:117] "RemoveContainer" containerID="8d34e90f5a770eadc4886ddd1ae59ecc5646e9ed932580ebfcde846e475cb74a" Mar 20 14:00:07 crc kubenswrapper[4755]: I0320 14:00:07.497178 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:07 crc kubenswrapper[4755]: E0320 14:00:07.498115 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.035813 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.046074 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-52m67"] Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.244827 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69707be4-e338-4e13-8ecc-8cfd7cd416b2" path="/var/lib/kubelet/pods/69707be4-e338-4e13-8ecc-8cfd7cd416b2/volumes" Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.606428 4755 generic.go:334] "Generic (PLEG): container finished" podID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerID="be6f17dea20c6888dee20766234195b91965ff23909e1764b2d47f7abaf02c60" exitCode=0 Mar 20 14:00:15 crc kubenswrapper[4755]: I0320 14:00:15.606493 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerDied","Data":"be6f17dea20c6888dee20766234195b91965ff23909e1764b2d47f7abaf02c60"} Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.005683 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.019557 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") pod \"c07cef7a-c5f6-4f4b-8508-8d499928b255\" (UID: \"c07cef7a-c5f6-4f4b-8508-8d499928b255\") " Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.025148 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf" (OuterVolumeSpecName: "kube-api-access-ns8vf") pod "c07cef7a-c5f6-4f4b-8508-8d499928b255" (UID: "c07cef7a-c5f6-4f4b-8508-8d499928b255"). InnerVolumeSpecName "kube-api-access-ns8vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.122277 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns8vf\" (UniqueName: \"kubernetes.io/projected/c07cef7a-c5f6-4f4b-8508-8d499928b255-kube-api-access-ns8vf\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643324 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" event={"ID":"c07cef7a-c5f6-4f4b-8508-8d499928b255","Type":"ContainerDied","Data":"832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14"} Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643610 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="832468751d91134949fed854a44ac7de8fd07687e1f317da50dfd2326a76eb14" Mar 20 14:00:17 crc kubenswrapper[4755]: I0320 14:00:17.643388 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-b2lfj" Mar 20 14:00:18 crc kubenswrapper[4755]: I0320 14:00:18.073543 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 14:00:18 crc kubenswrapper[4755]: I0320 14:00:18.084942 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-xfmxl"] Mar 20 14:00:19 crc kubenswrapper[4755]: I0320 14:00:19.237108 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea7c11fe-b29d-4fa4-a46d-7079105e883e" path="/var/lib/kubelet/pods/ea7c11fe-b29d-4fa4-a46d-7079105e883e/volumes" Mar 20 14:00:21 crc kubenswrapper[4755]: I0320 14:00:21.235316 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:21 crc kubenswrapper[4755]: E0320 14:00:21.236825 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.042043 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.051951 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rwsvb"] Mar 20 14:00:27 crc kubenswrapper[4755]: I0320 14:00:27.243153 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dddb768-c318-44b8-bac9-ea26f29ca038" path="/var/lib/kubelet/pods/5dddb768-c318-44b8-bac9-ea26f29ca038/volumes" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.859190 4755 
scope.go:117] "RemoveContainer" containerID="46a3e9d432eab1d344703d7d3e5b453a17e81e99ae489519757add01afaf2967" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.902300 4755 scope.go:117] "RemoveContainer" containerID="705f6219cf6e7229f8b2ed7393ea0a90aeac31b526f89efc1dd2e1e93d320b12" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.954074 4755 scope.go:117] "RemoveContainer" containerID="357db1fcf0376a2f7e5a8505188f8d07e91c1003508331ac8dd11eaeb9385e56" Mar 20 14:00:28 crc kubenswrapper[4755]: I0320 14:00:28.996288 4755 scope.go:117] "RemoveContainer" containerID="291d44fa2d759cebc2428335b7b6af1955b13cffd889d287d4c277526b8f07b6" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.033475 4755 scope.go:117] "RemoveContainer" containerID="035f6fa288ba835c95b145a119d04cf41e9e3a54cd012475c7a081a2276a5557" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.066350 4755 scope.go:117] "RemoveContainer" containerID="02c033d98a31eff9b6f2fd27a65dcce2cdba9ee50e31a547659840069ed55645" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.102380 4755 scope.go:117] "RemoveContainer" containerID="cea560be39cccd516b77d0d30da3bc9d64db06b7455423a8e22eacb2c87d57e2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.125565 4755 scope.go:117] "RemoveContainer" containerID="df8209f50d000896f89a57f1c660c7c93eb9377f2d931ebe77ace4e42c48c1f9" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.167131 4755 scope.go:117] "RemoveContainer" containerID="12434840c94f9e1507207814778783b795d528a9829a97a9b612e4417c0770d2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.198273 4755 scope.go:117] "RemoveContainer" containerID="60f5595fcede6ec841b414dc41e27b9bf107d18aaf78a0ca6302cf7b01dc28b2" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.218758 4755 scope.go:117] "RemoveContainer" containerID="b24cc29f4a3d45fd8adb655ff3aefc2dd43173d332839123e38cb6e66cc20cc0" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.257116 4755 scope.go:117] 
"RemoveContainer" containerID="e563b8f3d31e55e3468e71d1526b9d84a5066f3dfe1e07450115316e1267a59c" Mar 20 14:00:29 crc kubenswrapper[4755]: I0320 14:00:29.296222 4755 scope.go:117] "RemoveContainer" containerID="c657148ccc1a27d9b62255884d6a6e1d1019e179c3fce4621605696f07b5b3a8" Mar 20 14:00:32 crc kubenswrapper[4755]: I0320 14:00:32.029953 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 14:00:32 crc kubenswrapper[4755]: I0320 14:00:32.037556 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-cxr9p"] Mar 20 14:00:33 crc kubenswrapper[4755]: I0320 14:00:33.238891 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ea35a84-68ca-4490-b1d9-fa999ef63ebe" path="/var/lib/kubelet/pods/7ea35a84-68ca-4490-b1d9-fa999ef63ebe/volumes" Mar 20 14:00:35 crc kubenswrapper[4755]: I0320 14:00:35.226331 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:35 crc kubenswrapper[4755]: E0320 14:00:35.227236 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:00:38 crc kubenswrapper[4755]: I0320 14:00:38.025834 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 14:00:38 crc kubenswrapper[4755]: I0320 14:00:38.032939 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-dtggj"] Mar 20 14:00:39 crc kubenswrapper[4755]: I0320 14:00:39.238780 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="95c76f8c-7b76-4714-adac-6297b84d6492" path="/var/lib/kubelet/pods/95c76f8c-7b76-4714-adac-6297b84d6492/volumes" Mar 20 14:00:40 crc kubenswrapper[4755]: I0320 14:00:40.036434 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 14:00:40 crc kubenswrapper[4755]: I0320 14:00:40.051356 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-jrf8c"] Mar 20 14:00:41 crc kubenswrapper[4755]: I0320 14:00:41.246410 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25bd1da4-7fdb-4bd9-8405-a37fc6c18be0" path="/var/lib/kubelet/pods/25bd1da4-7fdb-4bd9-8405-a37fc6c18be0/volumes" Mar 20 14:00:48 crc kubenswrapper[4755]: I0320 14:00:48.226163 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:00:48 crc kubenswrapper[4755]: E0320 14:00:48.227075 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.143835 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.144831 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.144851 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.144871 4755 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.144879 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.145104 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e921d7-2303-42ff-ac0e-89b8b15127e4" containerName="collect-profiles" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.145139 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="c07cef7a-c5f6-4f4b-8508-8d499928b255" containerName="oc" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.170629 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.170816 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.225976 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:00 crc kubenswrapper[4755]: E0320 14:01:00.226463 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287709 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod 
\"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287756 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287823 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.287937 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390229 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390406 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod 
\"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.390490 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.391611 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.399693 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.401313 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.402298 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " 
pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.418037 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"keystone-cron-29566921-crbwk\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.517223 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:00 crc kubenswrapper[4755]: I0320 14:01:00.986607 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-crbwk"] Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.516576 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerStarted","Data":"e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065"} Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.516989 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerStarted","Data":"2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144"} Mar 20 14:01:01 crc kubenswrapper[4755]: I0320 14:01:01.534995 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566921-crbwk" podStartSLOduration=1.5349696800000001 podStartE2EDuration="1.53496968s" podCreationTimestamp="2026-03-20 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:01:01.533785437 +0000 UTC m=+1841.131717976" watchObservedRunningTime="2026-03-20 14:01:01.53496968 +0000 UTC m=+1841.132902249" Mar 20 
14:01:03 crc kubenswrapper[4755]: I0320 14:01:03.538824 4755 generic.go:334] "Generic (PLEG): container finished" podID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerID="e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065" exitCode=0 Mar 20 14:01:03 crc kubenswrapper[4755]: I0320 14:01:03.538908 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerDied","Data":"e69184e885edefcaa6ad275a8ecefe11cce8281520d327bb131182884f5c2065"} Mar 20 14:01:04 crc kubenswrapper[4755]: I0320 14:01:04.928205 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082398 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082546 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082768 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.082818 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") pod \"f12b18c9-e142-46a6-8d46-e711cbceae11\" (UID: \"f12b18c9-e142-46a6-8d46-e711cbceae11\") " Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.090917 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.093074 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr" (OuterVolumeSpecName: "kube-api-access-bz6fr") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "kube-api-access-bz6fr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.143995 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.147180 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data" (OuterVolumeSpecName: "config-data") pod "f12b18c9-e142-46a6-8d46-e711cbceae11" (UID: "f12b18c9-e142-46a6-8d46-e711cbceae11"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188734 4755 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188783 4755 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188896 4755 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f12b18c9-e142-46a6-8d46-e711cbceae11-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.188910 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz6fr\" (UniqueName: \"kubernetes.io/projected/f12b18c9-e142-46a6-8d46-e711cbceae11-kube-api-access-bz6fr\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575430 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-crbwk" event={"ID":"f12b18c9-e142-46a6-8d46-e711cbceae11","Type":"ContainerDied","Data":"2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144"} Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575490 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5931d4d7cb8a12f57c6949c17d4b5fc5b8b0bc4e44b9e219403fc58255c144" Mar 20 14:01:05 crc kubenswrapper[4755]: I0320 14:01:05.575571 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-crbwk" Mar 20 14:01:13 crc kubenswrapper[4755]: I0320 14:01:13.226366 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:13 crc kubenswrapper[4755]: E0320 14:01:13.227558 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:24 crc kubenswrapper[4755]: I0320 14:01:24.226252 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:24 crc kubenswrapper[4755]: E0320 14:01:24.227306 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.043992 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.054565 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.066262 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.078159 4755 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jqk4f"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.088302 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-79jc8"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.098341 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-9jv87"] Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.242803 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0deb3f1a-0cad-4429-9e79-38e5a0b38896" path="/var/lib/kubelet/pods/0deb3f1a-0cad-4429-9e79-38e5a0b38896/volumes" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.244004 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32a5606c-c777-4c0b-951c-6ce2e03edd7e" path="/var/lib/kubelet/pods/32a5606c-c777-4c0b-951c-6ce2e03edd7e/volumes" Mar 20 14:01:27 crc kubenswrapper[4755]: I0320 14:01:27.245241 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f395acec-f28b-4622-b349-127cf31ec92d" path="/var/lib/kubelet/pods/f395acec-f28b-4622-b349-127cf31ec92d/volumes" Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.052822 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.065496 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.074117 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.082546 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-ee84-account-create-update-jpmvf"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.090605 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-c99c-account-create-update-5s889"] Mar 20 14:01:28 crc kubenswrapper[4755]: I0320 14:01:28.097268 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-4e76-account-create-update-vjcr6"] Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.243350 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03accbff-bdf2-4256-bdf2-1b39d5485673" path="/var/lib/kubelet/pods/03accbff-bdf2-4256-bdf2-1b39d5485673/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.244106 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39991203-9b8d-4985-8e90-b3d1772f6b8f" path="/var/lib/kubelet/pods/39991203-9b8d-4985-8e90-b3d1772f6b8f/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.244822 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86" path="/var/lib/kubelet/pods/523cc2e8-d2eb-4f35-8aa2-d9212fdb5e86/volumes" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.533819 4755 scope.go:117] "RemoveContainer" containerID="d587742af74b9c3de668ae1984b0ef66f500518188cd1397356a05382499f597" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.563118 4755 scope.go:117] "RemoveContainer" containerID="08f6171aa1699ccfb785281f28dd2eaadb1c4c9db74aca0907d1d3cde8d623f6" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.620176 4755 scope.go:117] "RemoveContainer" containerID="a9a2e83547c76638fc8671a99e0bfb3517ad85689f2490760b78b38ac376cdd5" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.663089 4755 scope.go:117] "RemoveContainer" containerID="34bb019b6b2edd84278525de71c1498dee8194d1e832aa7f19aa00c20a976f27" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.715316 4755 scope.go:117] "RemoveContainer" containerID="f48f17b3619a61fc0cb88d69afecc573c1b266d447ea55b0cd7bd4a5a7acc1ba" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.752533 4755 scope.go:117] "RemoveContainer" 
containerID="8b5d8e206bbb1db488a0f5fd4025d2bbe54a60b5752cdd8ca8cc436020785363" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.834284 4755 scope.go:117] "RemoveContainer" containerID="88bd4bda57907a807f570d789758ec613bca12afd4a1c3728186284b0e247c1f" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.884191 4755 scope.go:117] "RemoveContainer" containerID="d8fd4a3d3a925ddfd7d83bb37c8a0048b0ca734ccc8e77bf2d187f2f3e9192d3" Mar 20 14:01:29 crc kubenswrapper[4755]: I0320 14:01:29.929114 4755 scope.go:117] "RemoveContainer" containerID="d596f288ad3d6c89ebb0bba48d21ab0517721798e3b70088d016e75a1dca8da7" Mar 20 14:01:39 crc kubenswrapper[4755]: I0320 14:01:39.226343 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:39 crc kubenswrapper[4755]: E0320 14:01:39.227525 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:50 crc kubenswrapper[4755]: I0320 14:01:50.225806 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:01:50 crc kubenswrapper[4755]: E0320 14:01:50.226655 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:01:56 crc 
kubenswrapper[4755]: I0320 14:01:56.055372 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 14:01:56 crc kubenswrapper[4755]: I0320 14:01:56.066768 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-mbd9g"] Mar 20 14:01:57 crc kubenswrapper[4755]: I0320 14:01:57.244299 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faef786e-b221-4fff-8d48-42b8163ed86a" path="/var/lib/kubelet/pods/faef786e-b221-4fff-8d48-42b8163ed86a/volumes" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.149286 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:00 crc kubenswrapper[4755]: E0320 14:02:00.150122 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.150141 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.150379 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="f12b18c9-e142-46a6-8d46-e711cbceae11" containerName="keystone-cron" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.151457 4755 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155028 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155209 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.155790 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.167188 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.301749 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.403724 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.430822 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"auto-csr-approver-29566922-9wz8j\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " 
pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:00 crc kubenswrapper[4755]: I0320 14:02:00.525121 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.046528 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-9wz8j"] Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.052169 4755 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:02:01 crc kubenswrapper[4755]: I0320 14:02:01.244326 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerStarted","Data":"61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4"} Mar 20 14:02:03 crc kubenswrapper[4755]: I0320 14:02:03.271149 4755 generic.go:334] "Generic (PLEG): container finished" podID="a0bc6168-1eb1-41f8-8921-70564488dc62" containerID="68a0b051f32e2cc882ef284df84e75dea8c9605ead333a40bfbbc39dbec1e0c1" exitCode=0 Mar 20 14:02:03 crc kubenswrapper[4755]: I0320 14:02:03.271242 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerDied","Data":"68a0b051f32e2cc882ef284df84e75dea8c9605ead333a40bfbbc39dbec1e0c1"} Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.226282 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:04 crc kubenswrapper[4755]: E0320 14:02:04.226639 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.663613 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.803547 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") pod \"a0bc6168-1eb1-41f8-8921-70564488dc62\" (UID: \"a0bc6168-1eb1-41f8-8921-70564488dc62\") " Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.812797 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb" (OuterVolumeSpecName: "kube-api-access-s6qvb") pod "a0bc6168-1eb1-41f8-8921-70564488dc62" (UID: "a0bc6168-1eb1-41f8-8921-70564488dc62"). InnerVolumeSpecName "kube-api-access-s6qvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:02:04 crc kubenswrapper[4755]: I0320 14:02:04.905074 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6qvb\" (UniqueName: \"kubernetes.io/projected/a0bc6168-1eb1-41f8-8921-70564488dc62-kube-api-access-s6qvb\") on node \"crc\" DevicePath \"\"" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290463 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" event={"ID":"a0bc6168-1eb1-41f8-8921-70564488dc62","Type":"ContainerDied","Data":"61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4"} Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290507 4755 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61671a28fbd88a8f2f787f0c45e8cba7c10eba3cd554b595c2415e7908a119c4" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.290520 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-9wz8j" Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.781849 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 14:02:05 crc kubenswrapper[4755]: I0320 14:02:05.793871 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-jf8sk"] Mar 20 14:02:07 crc kubenswrapper[4755]: I0320 14:02:07.247166 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c74d4c86-05c3-4ac3-a18e-cb75b4d95559" path="/var/lib/kubelet/pods/c74d4c86-05c3-4ac3-a18e-cb75b4d95559/volumes" Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.041729 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.050488 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell0-cell-mapping-vz8fw"] Mar 20 14:02:15 crc kubenswrapper[4755]: I0320 14:02:15.245337 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ff73477-b65b-4362-938c-94b1bb1f51b0" path="/var/lib/kubelet/pods/2ff73477-b65b-4362-938c-94b1bb1f51b0/volumes" Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.031964 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.039032 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbtvj"] Mar 20 14:02:16 crc kubenswrapper[4755]: I0320 14:02:16.226371 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:16 crc kubenswrapper[4755]: E0320 14:02:16.226899 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:17 crc kubenswrapper[4755]: I0320 14:02:17.240337 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cadbdc7c-ed66-43d7-82ee-d797beb959a8" path="/var/lib/kubelet/pods/cadbdc7c-ed66-43d7-82ee-d797beb959a8/volumes" Mar 20 14:02:27 crc kubenswrapper[4755]: I0320 14:02:27.227196 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:27 crc kubenswrapper[4755]: E0320 14:02:27.228382 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.066048 4755 scope.go:117] "RemoveContainer" containerID="4e04d78dce32103bd7e68191d5264cd6d0164ee782baa2facb4d70b046882e6a" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.116886 4755 scope.go:117] "RemoveContainer" containerID="ce9d805a2c4c50680c23940622d796b78d00ed9243eb4db8b57356fad93506d8" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.168535 4755 scope.go:117] "RemoveContainer" containerID="109e091277ccdd700aa371cc8183b41a8cbcb4b0999cde7a25e6711c4cbe8c28" Mar 20 14:02:30 crc kubenswrapper[4755]: I0320 14:02:30.217460 4755 scope.go:117] "RemoveContainer" containerID="8cfd090c83de7fa8769c21cee82a39a3d4da33f756361f7726ba02291aa9d718" Mar 20 14:02:42 crc kubenswrapper[4755]: I0320 14:02:42.270930 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:42 crc kubenswrapper[4755]: E0320 14:02:42.271865 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:55 crc kubenswrapper[4755]: I0320 14:02:55.226183 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:02:55 crc kubenswrapper[4755]: E0320 14:02:55.229298 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:02:58 crc kubenswrapper[4755]: I0320 14:02:58.055782 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 14:02:58 crc kubenswrapper[4755]: I0320 14:02:58.063277 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7fs4m"] Mar 20 14:02:59 crc kubenswrapper[4755]: I0320 14:02:59.245647 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557c5385-782c-410a-a371-b27f41d88a47" path="/var/lib/kubelet/pods/557c5385-782c-410a-a371-b27f41d88a47/volumes" Mar 20 14:03:10 crc kubenswrapper[4755]: I0320 14:03:10.226467 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:03:10 crc kubenswrapper[4755]: E0320 14:03:10.227527 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.668748 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:23 crc kubenswrapper[4755]: E0320 14:03:23.669575 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bc6168-1eb1-41f8-8921-70564488dc62" containerName="oc" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 
14:03:23.669586 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bc6168-1eb1-41f8-8921-70564488dc62" containerName="oc" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.669800 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bc6168-1eb1-41f8-8921-70564488dc62" containerName="oc" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.671056 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.679226 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.857903 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.857956 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpf8d\" (UniqueName: \"kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.858112 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.960388 
4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.960435 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpf8d\" (UniqueName: \"kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.960504 4755 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.960941 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.961084 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:23 crc kubenswrapper[4755]: I0320 14:03:23.983990 4755 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-bpf8d\" (UniqueName: \"kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d\") pod \"community-operators-n79cj\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:24 crc kubenswrapper[4755]: I0320 14:03:24.000688 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:24 crc kubenswrapper[4755]: I0320 14:03:24.226312 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:03:24 crc kubenswrapper[4755]: E0320 14:03:24.226703 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:03:24 crc kubenswrapper[4755]: I0320 14:03:24.562314 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:25 crc kubenswrapper[4755]: I0320 14:03:25.192443 4755 generic.go:334] "Generic (PLEG): container finished" podID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerID="114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec" exitCode=0 Mar 20 14:03:25 crc kubenswrapper[4755]: I0320 14:03:25.192544 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerDied","Data":"114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec"} Mar 20 14:03:25 crc kubenswrapper[4755]: I0320 14:03:25.192731 4755 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerStarted","Data":"93a5b04c1c24d8e41e24cd5cb7a24365defd792345924a01a07b22ec033760e2"} Mar 20 14:03:26 crc kubenswrapper[4755]: I0320 14:03:26.201716 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerStarted","Data":"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849"} Mar 20 14:03:27 crc kubenswrapper[4755]: I0320 14:03:27.210725 4755 generic.go:334] "Generic (PLEG): container finished" podID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerID="056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849" exitCode=0 Mar 20 14:03:27 crc kubenswrapper[4755]: I0320 14:03:27.210824 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerDied","Data":"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849"} Mar 20 14:03:28 crc kubenswrapper[4755]: I0320 14:03:28.245293 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerStarted","Data":"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894"} Mar 20 14:03:28 crc kubenswrapper[4755]: I0320 14:03:28.272804 4755 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n79cj" podStartSLOduration=2.820876627 podStartE2EDuration="5.272777836s" podCreationTimestamp="2026-03-20 14:03:23 +0000 UTC" firstStartedPulling="2026-03-20 14:03:25.196172333 +0000 UTC m=+1984.794104862" lastFinishedPulling="2026-03-20 14:03:27.648073512 +0000 UTC m=+1987.246006071" observedRunningTime="2026-03-20 
14:03:28.265929062 +0000 UTC m=+1987.863861591" watchObservedRunningTime="2026-03-20 14:03:28.272777836 +0000 UTC m=+1987.870710395" Mar 20 14:03:30 crc kubenswrapper[4755]: I0320 14:03:30.328921 4755 scope.go:117] "RemoveContainer" containerID="93d3611d6dc9b7879481031c0ca175844e4e535b8456bd316e081a243992d2fa" Mar 20 14:03:34 crc kubenswrapper[4755]: I0320 14:03:34.000927 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:34 crc kubenswrapper[4755]: I0320 14:03:34.001571 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:34 crc kubenswrapper[4755]: I0320 14:03:34.083964 4755 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:34 crc kubenswrapper[4755]: I0320 14:03:34.363612 4755 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:34 crc kubenswrapper[4755]: I0320 14:03:34.410477 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:35 crc kubenswrapper[4755]: I0320 14:03:35.226277 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:03:35 crc kubenswrapper[4755]: E0320 14:03:35.226970 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:03:36 crc kubenswrapper[4755]: I0320 14:03:36.330026 4755 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n79cj" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="registry-server" containerID="cri-o://b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894" gracePeriod=2 Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.304820 4755 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.348612 4755 generic.go:334] "Generic (PLEG): container finished" podID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerID="b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894" exitCode=0 Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.348702 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerDied","Data":"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894"} Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.348738 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n79cj" event={"ID":"b89f522d-7355-47bf-a015-2e8a7569e1fe","Type":"ContainerDied","Data":"93a5b04c1c24d8e41e24cd5cb7a24365defd792345924a01a07b22ec033760e2"} Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.348770 4755 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n79cj" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.348774 4755 scope.go:117] "RemoveContainer" containerID="b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.379426 4755 scope.go:117] "RemoveContainer" containerID="056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.402075 4755 scope.go:117] "RemoveContainer" containerID="114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.423079 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities\") pod \"b89f522d-7355-47bf-a015-2e8a7569e1fe\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.423161 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content\") pod \"b89f522d-7355-47bf-a015-2e8a7569e1fe\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.423277 4755 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpf8d\" (UniqueName: \"kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d\") pod \"b89f522d-7355-47bf-a015-2e8a7569e1fe\" (UID: \"b89f522d-7355-47bf-a015-2e8a7569e1fe\") " Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.423952 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities" (OuterVolumeSpecName: "utilities") pod "b89f522d-7355-47bf-a015-2e8a7569e1fe" (UID: 
"b89f522d-7355-47bf-a015-2e8a7569e1fe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.430090 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d" (OuterVolumeSpecName: "kube-api-access-bpf8d") pod "b89f522d-7355-47bf-a015-2e8a7569e1fe" (UID: "b89f522d-7355-47bf-a015-2e8a7569e1fe"). InnerVolumeSpecName "kube-api-access-bpf8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.473853 4755 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b89f522d-7355-47bf-a015-2e8a7569e1fe" (UID: "b89f522d-7355-47bf-a015-2e8a7569e1fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.490325 4755 scope.go:117] "RemoveContainer" containerID="b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894" Mar 20 14:03:37 crc kubenswrapper[4755]: E0320 14:03:37.490963 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894\": container with ID starting with b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894 not found: ID does not exist" containerID="b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.490989 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894"} err="failed to get container status 
\"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894\": rpc error: code = NotFound desc = could not find container \"b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894\": container with ID starting with b87ab53eb52d8b8d84ed4553797374e719a6f52a4f160447ea73d95578df1894 not found: ID does not exist" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.491009 4755 scope.go:117] "RemoveContainer" containerID="056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849" Mar 20 14:03:37 crc kubenswrapper[4755]: E0320 14:03:37.491364 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849\": container with ID starting with 056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849 not found: ID does not exist" containerID="056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.491385 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849"} err="failed to get container status \"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849\": rpc error: code = NotFound desc = could not find container \"056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849\": container with ID starting with 056bea7dc8904431f8517f4ca7afdc8172f63e283c49e3fe18e726a6fc973849 not found: ID does not exist" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.491399 4755 scope.go:117] "RemoveContainer" containerID="114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec" Mar 20 14:03:37 crc kubenswrapper[4755]: E0320 14:03:37.491709 4755 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec\": container with ID starting with 114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec not found: ID does not exist" containerID="114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.491825 4755 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec"} err="failed to get container status \"114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec\": rpc error: code = NotFound desc = could not find container \"114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec\": container with ID starting with 114fd778bd644838fdbfbfa065bf60af55f98a012f7ad2d431c0eab425fd9bec not found: ID does not exist" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.525190 4755 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpf8d\" (UniqueName: \"kubernetes.io/projected/b89f522d-7355-47bf-a015-2e8a7569e1fe-kube-api-access-bpf8d\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.525219 4755 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.525230 4755 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b89f522d-7355-47bf-a015-2e8a7569e1fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.687263 4755 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:37 crc kubenswrapper[4755]: I0320 14:03:37.697450 4755 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-n79cj"] Mar 20 14:03:39 crc kubenswrapper[4755]: I0320 14:03:39.245401 4755 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" path="/var/lib/kubelet/pods/b89f522d-7355-47bf-a015-2e8a7569e1fe/volumes" Mar 20 14:03:50 crc kubenswrapper[4755]: I0320 14:03:50.226114 4755 scope.go:117] "RemoveContainer" containerID="0ebe0d49301cf40a11626c6ae5bd652aba471d8831c5aedc1a2166a7bbf00154" Mar 20 14:03:50 crc kubenswrapper[4755]: E0320 14:03:50.227242 4755 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xmn6s_openshift-machine-config-operator(3eb406f6-1a26-4eea-84ac-e55f5232900c)\"" pod="openshift-machine-config-operator/machine-config-daemon-xmn6s" podUID="3eb406f6-1a26-4eea-84ac-e55f5232900c" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.169571 4755 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t9ncj"] Mar 20 14:04:00 crc kubenswrapper[4755]: E0320 14:04:00.170705 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="extract-content" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.170720 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="extract-content" Mar 20 14:04:00 crc kubenswrapper[4755]: E0320 14:04:00.170752 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="extract-utilities" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.170760 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="extract-utilities" Mar 20 14:04:00 crc kubenswrapper[4755]: E0320 
14:04:00.170776 4755 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="registry-server" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.170785 4755 state_mem.go:107] "Deleted CPUSet assignment" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="registry-server" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.171021 4755 memory_manager.go:354] "RemoveStaleState removing state" podUID="b89f522d-7355-47bf-a015-2e8a7569e1fe" containerName="registry-server" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.171814 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t9ncj" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.173749 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t9ncj"] Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.194952 4755 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-vdtph" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.195241 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.195399 4755 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.297267 4755 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xggmt\" (UniqueName: \"kubernetes.io/projected/b7e43534-1947-48a0-ae2b-7334bfd94e5e-kube-api-access-xggmt\") pod \"auto-csr-approver-29566924-t9ncj\" (UID: \"b7e43534-1947-48a0-ae2b-7334bfd94e5e\") " pod="openshift-infra/auto-csr-approver-29566924-t9ncj" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.399676 4755 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xggmt\" (UniqueName: \"kubernetes.io/projected/b7e43534-1947-48a0-ae2b-7334bfd94e5e-kube-api-access-xggmt\") pod \"auto-csr-approver-29566924-t9ncj\" (UID: \"b7e43534-1947-48a0-ae2b-7334bfd94e5e\") " pod="openshift-infra/auto-csr-approver-29566924-t9ncj" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.435200 4755 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xggmt\" (UniqueName: \"kubernetes.io/projected/b7e43534-1947-48a0-ae2b-7334bfd94e5e-kube-api-access-xggmt\") pod \"auto-csr-approver-29566924-t9ncj\" (UID: \"b7e43534-1947-48a0-ae2b-7334bfd94e5e\") " pod="openshift-infra/auto-csr-approver-29566924-t9ncj" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.517974 4755 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-t9ncj" Mar 20 14:04:00 crc kubenswrapper[4755]: I0320 14:04:00.796599 4755 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-t9ncj"] Mar 20 14:04:00 crc kubenswrapper[4755]: W0320 14:04:00.801579 4755 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7e43534_1947_48a0_ae2b_7334bfd94e5e.slice/crio-664976d1f40992b60346540a47e08d73039237d0c64fbfd9aa6257d84a991ff7 WatchSource:0}: Error finding container 664976d1f40992b60346540a47e08d73039237d0c64fbfd9aa6257d84a991ff7: Status 404 returned error can't find the container with id 664976d1f40992b60346540a47e08d73039237d0c64fbfd9aa6257d84a991ff7 Mar 20 14:04:01 crc kubenswrapper[4755]: I0320 14:04:01.667736 4755 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-t9ncj" event={"ID":"b7e43534-1947-48a0-ae2b-7334bfd94e5e","Type":"ContainerStarted","Data":"664976d1f40992b60346540a47e08d73039237d0c64fbfd9aa6257d84a991ff7"} 